Nov 29 14:28:21 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 29 14:28:21 crc restorecon[4684]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 14:28:21 crc restorecon[4684]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc 
restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc 
restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 
14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 
crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 
crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc 
restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc 
restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 14:28:21 crc restorecon[4684]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 29 14:28:22 crc kubenswrapper[4907]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 29 14:28:22 crc kubenswrapper[4907]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 29 14:28:22 crc kubenswrapper[4907]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 29 14:28:22 crc kubenswrapper[4907]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 29 14:28:22 crc kubenswrapper[4907]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 29 14:28:22 crc kubenswrapper[4907]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.239659 4907 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.248376 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.248632 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.248764 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.248868 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.248993 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.249100 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.249214 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.249325 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.249417 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.249558 4907 
feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.249653 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.249744 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.249833 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.249939 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250031 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250121 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250225 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250317 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250435 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250562 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250669 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250761 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250850 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.250942 4907 feature_gate.go:330] unrecognized feature 
gate: SetEIPForNLBIngressController Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251030 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251117 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251204 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251301 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251392 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251511 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251611 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251706 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251796 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.251905 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252008 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252100 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252190 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252279 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252366 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252488 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252585 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252691 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252784 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252882 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.252982 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.253078 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.253171 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.253260 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.253358 4907 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.253520 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.253638 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.253734 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.253823 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.253911 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254000 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254099 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254190 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254286 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254380 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254540 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254638 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254728 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254833 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.254928 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.255026 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.255119 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.255209 4907 feature_gate.go:330] unrecognized feature gate: Example
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.255298 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.255387 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.255522 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.255620 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.256150 4907 flags.go:64] FLAG: --address="0.0.0.0"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.256284 4907 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.256398 4907 flags.go:64] FLAG: --anonymous-auth="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.256535 4907 flags.go:64] FLAG: --application-metrics-count-limit="100"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.256636 4907 flags.go:64] FLAG: --authentication-token-webhook="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.256749 4907 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.256851 4907 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.256945 4907 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257038 4907 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257134 4907 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257227 4907 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257337 4907 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257497 4907 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257602 4907 flags.go:64] FLAG: --cgroup-root=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257696 4907 flags.go:64] FLAG: --cgroups-per-qos="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257788 4907 flags.go:64] FLAG: --client-ca-file=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257879 4907 flags.go:64] FLAG: --cloud-config=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.257970 4907 flags.go:64] FLAG: --cloud-provider=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.258076 4907 flags.go:64] FLAG: --cluster-dns="[]"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.258197 4907 flags.go:64] FLAG: --cluster-domain=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.258299 4907 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.258395 4907 flags.go:64] FLAG: --config-dir=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.258565 4907 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.258671 4907 flags.go:64] FLAG: --container-log-max-files="5"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.258808 4907 flags.go:64] FLAG: --container-log-max-size="10Mi"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.258902 4907 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.259014 4907 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.259112 4907 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.259204 4907 flags.go:64] FLAG: --contention-profiling="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.259296 4907 flags.go:64] FLAG: --cpu-cfs-quota="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.259386 4907 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.259527 4907 flags.go:64] FLAG: --cpu-manager-policy="none"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.259628 4907 flags.go:64] FLAG: --cpu-manager-policy-options=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.259723 4907 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.259975 4907 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.260088 4907 flags.go:64] FLAG: --enable-debugging-handlers="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.260193 4907 flags.go:64] FLAG: --enable-load-reader="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.260288 4907 flags.go:64] FLAG: --enable-server="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.260379 4907 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.260532 4907 flags.go:64] FLAG: --event-burst="100"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.260671 4907 flags.go:64] FLAG: --event-qps="50"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.260771 4907 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.260874 4907 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.260969 4907 flags.go:64] FLAG: --eviction-hard=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.261066 4907 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.261159 4907 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.261252 4907 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.261380 4907 flags.go:64] FLAG: --eviction-soft=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.261542 4907 flags.go:64] FLAG: --eviction-soft-grace-period=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.261643 4907 flags.go:64] FLAG: --exit-on-lock-contention="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.261754 4907 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.261851 4907 flags.go:64] FLAG: --experimental-mounter-path=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.261944 4907 flags.go:64] FLAG: --fail-cgroupv1="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.262046 4907 flags.go:64] FLAG: --fail-swap-on="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.262141 4907 flags.go:64] FLAG: --feature-gates=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.262250 4907 flags.go:64] FLAG: --file-check-frequency="20s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.262345 4907 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.262488 4907 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.262617 4907 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.262716 4907 flags.go:64] FLAG: --healthz-port="10248"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.262812 4907 flags.go:64] FLAG: --help="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.262907 4907 flags.go:64] FLAG: --hostname-override=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263001 4907 flags.go:64] FLAG: --housekeeping-interval="10s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263095 4907 flags.go:64] FLAG: --http-check-frequency="20s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263207 4907 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263305 4907 flags.go:64] FLAG: --image-credential-provider-config=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263400 4907 flags.go:64] FLAG: --image-gc-high-threshold="85"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263543 4907 flags.go:64] FLAG: --image-gc-low-threshold="80"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263644 4907 flags.go:64] FLAG: --image-service-endpoint=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263739 4907 flags.go:64] FLAG: --kernel-memcg-notification="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263832 4907 flags.go:64] FLAG: --kube-api-burst="100"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.263938 4907 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264035 4907 flags.go:64] FLAG: --kube-api-qps="50"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264135 4907 flags.go:64] FLAG: --kube-reserved=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264230 4907 flags.go:64] FLAG: --kube-reserved-cgroup=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264324 4907 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264419 4907 flags.go:64] FLAG: --kubelet-cgroups=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264576 4907 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264687 4907 flags.go:64] FLAG: --lock-file=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264792 4907 flags.go:64] FLAG: --log-cadvisor-usage="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264888 4907 flags.go:64] FLAG: --log-flush-frequency="5s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.264983 4907 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.265084 4907 flags.go:64] FLAG: --log-json-split-stream="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.265196 4907 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.265295 4907 flags.go:64] FLAG: --log-text-split-stream="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.265398 4907 flags.go:64] FLAG: --logging-format="text"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.265581 4907 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.265693 4907 flags.go:64] FLAG: --make-iptables-util-chains="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.265790 4907 flags.go:64] FLAG: --manifest-url=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.265884 4907 flags.go:64] FLAG: --manifest-url-header=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.265992 4907 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.266089 4907 flags.go:64] FLAG: --max-open-files="1000000"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.266186 4907 flags.go:64] FLAG: --max-pods="110"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.266290 4907 flags.go:64] FLAG: --maximum-dead-containers="-1"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.266388 4907 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.266740 4907 flags.go:64] FLAG: --memory-manager-policy="None"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.266860 4907 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.266958 4907 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.267053 4907 flags.go:64] FLAG: --node-ip="192.168.126.11"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.267149 4907 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.267278 4907 flags.go:64] FLAG: --node-status-max-images="50"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.267388 4907 flags.go:64] FLAG: --node-status-update-frequency="10s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.267520 4907 flags.go:64] FLAG: --oom-score-adj="-999"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.267621 4907 flags.go:64] FLAG: --pod-cidr=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.267716 4907 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.267817 4907 flags.go:64] FLAG: --pod-manifest-path=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.267917 4907 flags.go:64] FLAG: --pod-max-pids="-1"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268047 4907 flags.go:64] FLAG: --pods-per-core="0"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268159 4907 flags.go:64] FLAG: --port="10250"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268256 4907 flags.go:64] FLAG: --protect-kernel-defaults="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268351 4907 flags.go:64] FLAG: --provider-id=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268474 4907 flags.go:64] FLAG: --qos-reserved=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268577 4907 flags.go:64] FLAG: --read-only-port="10255"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268672 4907 flags.go:64] FLAG: --register-node="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268784 4907 flags.go:64] FLAG: --register-schedulable="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268883 4907 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.268987 4907 flags.go:64] FLAG: --registry-burst="10"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.269082 4907 flags.go:64] FLAG: --registry-qps="5"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.269176 4907 flags.go:64] FLAG: --reserved-cpus=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.269287 4907 flags.go:64] FLAG: --reserved-memory=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.269389 4907 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.269580 4907 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.269685 4907 flags.go:64] FLAG: --rotate-certificates="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.269781 4907 flags.go:64] FLAG: --rotate-server-certificates="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.269876 4907 flags.go:64] FLAG: --runonce="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.269980 4907 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.270077 4907 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.270172 4907 flags.go:64] FLAG: --seccomp-default="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.270274 4907 flags.go:64] FLAG: --serialize-image-pulls="true"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.270370 4907 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.270497 4907 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.270611 4907 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.270734 4907 flags.go:64] FLAG: --storage-driver-password="root"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.270831 4907 flags.go:64] FLAG: --storage-driver-secure="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.270924 4907 flags.go:64] FLAG: --storage-driver-table="stats"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271018 4907 flags.go:64] FLAG: --storage-driver-user="root"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271121 4907 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271220 4907 flags.go:64] FLAG: --sync-frequency="1m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271315 4907 flags.go:64] FLAG: --system-cgroups=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271408 4907 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271545 4907 flags.go:64] FLAG: --system-reserved-cgroup=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271643 4907 flags.go:64] FLAG: --tls-cert-file=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271736 4907 flags.go:64] FLAG: --tls-cipher-suites="[]"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271866 4907 flags.go:64] FLAG: --tls-min-version=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.271964 4907 flags.go:64] FLAG: --tls-private-key-file=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.272058 4907 flags.go:64] FLAG: --topology-manager-policy="none"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.272153 4907 flags.go:64] FLAG: --topology-manager-policy-options=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.272247 4907 flags.go:64] FLAG: --topology-manager-scope="container"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.272341 4907 flags.go:64] FLAG: --v="2"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.272478 4907 flags.go:64] FLAG: --version="false"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.272600 4907 flags.go:64] FLAG: --vmodule=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.272701 4907 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.272796 4907 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.273130 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.273235 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.273343 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.273444 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.273674 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.273775 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.273879 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.273973 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.274065 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.274156 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.274245 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.274344 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.274444 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.274566 4907 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.274660 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.274826 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.274923 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275014 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275121 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275214 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275304 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275393 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275578 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275631 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275644 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275655 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275666 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275676 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275685 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275696 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275706 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275714 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275723 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275731 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275739 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275746 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275757 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275765 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275792 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275800 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275808 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275816 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275824 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275832 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275841 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275849 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275857 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275865 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275874 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275882 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275890 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275897 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275905 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275916 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275926 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275935 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275943 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275951 4907 feature_gate.go:330] unrecognized feature gate: Example
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275961 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275969 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275977 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275989 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.275997 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.276006 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.276014 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.276022 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.276031 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.276040 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.276048 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.276056 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.276064 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.276350 4907 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.286963 4907 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.287008 4907 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287145 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287157 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287166 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287175 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287183 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287191 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287198 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287210 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287223 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287232 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287241 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287250 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287259 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287268 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287275 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287284 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287291 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287299 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287309 4907 feature_gate.go:353] Setting GA
feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287320 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287328 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287337 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287345 4907 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287353 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287361 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287371 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287381 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287391 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287401 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287409 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287416 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287424 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287432 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287439 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287476 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287484 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287492 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287500 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287507 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287516 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287524 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287532 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 29 14:28:22 crc 
kubenswrapper[4907]: W1129 14:28:22.287539 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287547 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287555 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287563 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287570 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287579 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287587 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287594 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287602 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287610 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287618 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287625 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287633 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287641 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287648 4907 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287656 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287663 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287671 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287679 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287686 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287693 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287701 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287708 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287716 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287724 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287731 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287739 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287747 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287762 4907 feature_gate.go:330] unrecognized feature gate: Example Nov 29 
14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.287776 4907 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.287995 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288006 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288017 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288027 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288036 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288044 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288051 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288059 4907 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288067 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288076 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 29 14:28:22 crc 
kubenswrapper[4907]: W1129 14:28:22.288084 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288091 4907 feature_gate.go:330] unrecognized feature gate: Example Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288099 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288106 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288114 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288122 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288129 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288137 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288147 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288157 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288166 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288174 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288182 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288190 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288198 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288206 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288214 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288222 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288233 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288243 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288253 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288262 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288271 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288279 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288289 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288298 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288306 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288316 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288326 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288335 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288344 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288352 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288360 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288368 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288375 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288384 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288392 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288399 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288406 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288414 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288422 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288429 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288442 4907 
feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288472 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288480 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288488 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288496 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288504 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288511 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288518 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288526 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288534 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288542 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288550 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288559 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288566 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288574 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 29 14:28:22 crc kubenswrapper[4907]: 
W1129 14:28:22.288582 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288590 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288597 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.288606 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.288619 4907 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.288860 4907 server.go:940] "Client rotation is on, will bootstrap in background" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.293353 4907 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.293515 4907 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.294343 4907 server.go:997] "Starting client certificate rotation" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.294381 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.294540 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-27 20:42:58.963444084 +0000 UTC Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.294639 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.305314 4907 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.307237 4907 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.309479 4907 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.321477 4907 log.go:25] "Validated CRI v1 runtime API" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.348070 4907 log.go:25] "Validated CRI v1 image API" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.350072 4907 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.352783 4907 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-29-14-23-53-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.352845 4907 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.378170 4907 manager.go:217] Machine: {Timestamp:2025-11-29 14:28:22.376089526 +0000 UTC m=+0.362927258 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:aa1144e5-f0f0-4c33-8960-c154529ab598 BootID:78619aac-2f63-40e5-809f-f2f742346ccf Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 
Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:35:5d:1f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:35:5d:1f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f6:61:ae Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4c:b5:23 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ee:f7:73 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:75:f6:dd Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3a:f8:9e:d8:b3:4a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:de:f6:01:69:59:b8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 
Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.378593 4907 manager_no_libpfm.go:29] cAdvisor is build 
without cgo and/or libpfm support. Perf event counters are not available.
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.378812 4907 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.379547 4907 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.379826 4907 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.379883 4907 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.380208 4907 topology_manager.go:138] "Creating topology manager with none policy"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.380225 4907 container_manager_linux.go:303] "Creating device plugin manager"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.380598 4907 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.380660 4907 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.381048 4907 state_mem.go:36] "Initialized new in-memory state store"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.381183 4907 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.383744 4907 kubelet.go:418] "Attempting to sync node with API server"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.383775 4907 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.383811 4907 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.383831 4907 kubelet.go:324] "Adding apiserver pod source"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.383850 4907 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.385963 4907 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.386567 4907 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.387314 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.387331 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.387428 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.387433 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.388263 4907 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.388950 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.388984 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.388993 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389001 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389015 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389025 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389034 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389055 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389064 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389073 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389105 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389115 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389353 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.389783 4907 server.go:1280] "Started kubelet"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.390407 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.391237 4907 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.391412 4907 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 29 14:28:22 crc systemd[1]: Started Kubernetes Kubelet.
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.394811 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.395067 4907 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.392388 4907 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.398547 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 09:04:28.018171909 +0000 UTC
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.399792 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 186h36m5.618397284s for next certificate rotation
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.398952 4907 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.399035 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.398922 4907 volume_manager.go:287] "The desired_state_of_world populator starts"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.399889 4907 volume_manager.go:289] "Starting Kubelet Volume Manager"
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.399971 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.400013 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="200ms"
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.400091 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.399308 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.47:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187c808e3b39ed22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 14:28:22.38975517 +0000 UTC m=+0.376592832,LastTimestamp:2025-11-29 14:28:22.38975517 +0000 UTC m=+0.376592832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.406031 4907 factory.go:55] Registering systemd factory
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.406071 4907 factory.go:221] Registration of the systemd container factory successfully
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.407426 4907 factory.go:153] Registering CRI-O factory
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.407516 4907 factory.go:221] Registration of the crio container factory successfully
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.407662 4907 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.407710 4907 factory.go:103] Registering Raw factory
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.407742 4907 manager.go:1196] Started watching for new ooms in manager
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.408426 4907 server.go:460] "Adding debug handlers to kubelet server"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.409246 4907 manager.go:319] Starting recovery of all containers
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417133 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417223 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417247 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417268 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417287 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417307 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417327 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417346 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417367 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417385 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417404 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417422 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417440 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417510 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417533 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417553 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417579 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417597 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417616 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417635 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417651 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417707 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417725 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417744 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417764 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417782 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417808 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417829 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417847 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417864 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417883 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417901 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417918 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417936 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417957 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417976 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.417993 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418010 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418030 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418051 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418099 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418118 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418135 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418153 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418170 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418188 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418206 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418224 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418241 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418259 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418278 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418297 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418322 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418341 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418361 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418379 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418398 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418420 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418444 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418493 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418517 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418534 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418553 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418572 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418590 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418609 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418627 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418643 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418660 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418678 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418695 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418714 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418730 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418747 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418766 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418783 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418800 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418819 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418837 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418857 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418909 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418929 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129
14:28:22.418948 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418974 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.418995 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419014 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419033 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419064 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419084 4907 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419102 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419121 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419138 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419156 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419172 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419189 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419214 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419231 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419249 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419265 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419282 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419299 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419316 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419333 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419351 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419375 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419393 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419413 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 29 
14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419445 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419499 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.419528 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.424381 4907 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425773 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425833 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425860 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425881 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425900 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425923 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425942 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425961 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425978 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.425996 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426014 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426032 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426051 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426072 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426091 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426135 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426154 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426171 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426198 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426216 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426235 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426254 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426273 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426291 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426309 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426327 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" 
seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426348 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426366 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426385 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426402 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426420 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426540 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 29 14:28:22 crc 
kubenswrapper[4907]: I1129 14:28:22.426564 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426584 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426601 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.426620 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427262 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427312 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427332 4907 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427366 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427386 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427414 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427433 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427495 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427713 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427744 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427779 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427808 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427836 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427873 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427899 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427934 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427963 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.427990 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.428026 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.428053 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.428089 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.428115 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.428141 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.428176 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.428202 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434074 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434265 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434314 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434394 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434420 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434447 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434507 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434528 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434570 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434591 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434614 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434639 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434683 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434709 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434731 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434764 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434788 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434808 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434834 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434855 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434875 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434900 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434919 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434947 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.434974 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435003 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435043 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435067 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435104 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435134 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435155 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435181 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435202 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435225 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435246 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435265 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435290 4907 reconstruct.go:97] "Volume reconstruction finished"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.435317 4907 reconciler.go:26] "Reconciler: start to sync state"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.443989 4907 manager.go:324] Recovery completed
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.454668 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.456070 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.456107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.456119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.456990 4907 cpu_manager.go:225] "Starting CPU manager" policy="none"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.457009 4907 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.457029 4907 state_mem.go:36] "Initialized new in-memory state store"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.467425 4907 policy_none.go:49] "None policy: Start"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.468211 4907 memory_manager.go:170] "Starting memorymanager" policy="None"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.468241 4907 state_mem.go:35] "Initializing new in-memory state store"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.474523 4907 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.478140 4907 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.478225 4907 status_manager.go:217] "Starting to sync pod status with apiserver"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.478273 4907 kubelet.go:2335] "Starting kubelet main sync loop"
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.478363 4907 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.479449 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.479578 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError"
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.500126 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.541110 4907 manager.go:334] "Starting Device Plugin manager"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.541288 4907 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.541309 4907 server.go:79] "Starting device plugin registration server"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.541897 4907 eviction_manager.go:189] "Eviction manager: starting control loop"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.541974 4907 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.542217 4907 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.542432 4907 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.542483 4907 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.556262 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.578853 4907 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.579014 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.580701 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.580771 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.580795 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.581013 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.581665 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.581765 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.582328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.582413 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.582548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.582860 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.582921 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.582964 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.582896 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.583031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.583043 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.584286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.584379 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.584407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.584381 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.584485 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.584506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.584642 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.584886 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.584948 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.585549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.585608 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.585635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.585917 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.585949 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.585983 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.586000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.586177 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.586228 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.587134 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.587182 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.587201 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.587335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.587379 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.587404 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.587785 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.587851 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.588997 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.589192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.589216 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.601397 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="400ms"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637223 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637304 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637346 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637384 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637420 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637488 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637520 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637555 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637589 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637619 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637679 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637726 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637760 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637798 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.637833 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.642491 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.644003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.644049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.644128 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.644160 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.644909 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.47:6443: connect: connection refused" node="crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739433 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739531 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739566 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739597 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739633 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739665 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739697 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739748 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739778 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739812 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739839 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739842 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739942 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740001 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740006 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739764 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739866 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName:
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740086 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740093 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740125 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740133 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740042 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.739947 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740180 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740183 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740203 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740233 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740275 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740434 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.740661 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.846080 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.855102 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.855733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.855759 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.855803 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 14:28:22 crc kubenswrapper[4907]: E1129 14:28:22.856476 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.47:6443: connect: connection refused" node="crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.925710 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.954558 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.962129 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-1fd5828a7d750e40ec09dfd9be6462c2f007fac4892e122c8b1c5ac47358d969 WatchSource:0}: Error finding container 1fd5828a7d750e40ec09dfd9be6462c2f007fac4892e122c8b1c5ac47358d969: Status 404 returned error can't find the container with id 1fd5828a7d750e40ec09dfd9be6462c2f007fac4892e122c8b1c5ac47358d969 Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.968739 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.988838 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7928ac3229aaee929fb70f13b7f1a4a4919515c7b377ad1505df652964739fe8 WatchSource:0}: Error finding container 7928ac3229aaee929fb70f13b7f1a4a4919515c7b377ad1505df652964739fe8: Status 404 returned error can't find the container with id 7928ac3229aaee929fb70f13b7f1a4a4919515c7b377ad1505df652964739fe8 Nov 29 14:28:22 crc kubenswrapper[4907]: I1129 14:28:22.990523 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 29 14:28:22 crc kubenswrapper[4907]: W1129 14:28:22.995887 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-25c75eb6e066d24ef1d0914d536c30795bdadc366af4b81323378b83e5e38762 WatchSource:0}: Error finding container 25c75eb6e066d24ef1d0914d536c30795bdadc366af4b81323378b83e5e38762: Status 404 returned error can't find the container with id 25c75eb6e066d24ef1d0914d536c30795bdadc366af4b81323378b83e5e38762 Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.001516 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:23 crc kubenswrapper[4907]: E1129 14:28:23.002187 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="800ms" Nov 29 14:28:23 crc kubenswrapper[4907]: W1129 14:28:23.012045 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5d070bf673ca6701edcdbe8743e7ed73c0a4ed7e9ac79b197ae87f617dff94dd WatchSource:0}: Error finding container 5d070bf673ca6701edcdbe8743e7ed73c0a4ed7e9ac79b197ae87f617dff94dd: Status 404 returned error can't find the container with id 5d070bf673ca6701edcdbe8743e7ed73c0a4ed7e9ac79b197ae87f617dff94dd Nov 29 14:28:23 crc kubenswrapper[4907]: W1129 14:28:23.034846 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d184ad95186ad3c73beb022767a7f0a9e53faca1a5124c73ef7b92e1404d01ed WatchSource:0}: Error finding container 
d184ad95186ad3c73beb022767a7f0a9e53faca1a5124c73ef7b92e1404d01ed: Status 404 returned error can't find the container with id d184ad95186ad3c73beb022767a7f0a9e53faca1a5124c73ef7b92e1404d01ed Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.256798 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.258558 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.258596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.258608 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.258639 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 14:28:23 crc kubenswrapper[4907]: E1129 14:28:23.258953 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.47:6443: connect: connection refused" node="crc" Nov 29 14:28:23 crc kubenswrapper[4907]: W1129 14:28:23.347139 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Nov 29 14:28:23 crc kubenswrapper[4907]: E1129 14:28:23.347205 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" 
logger="UnhandledError" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.391867 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.484741 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027" exitCode=0 Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.484813 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027"} Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.484886 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d184ad95186ad3c73beb022767a7f0a9e53faca1a5124c73ef7b92e1404d01ed"} Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.484969 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.485895 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.485924 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.485934 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.486523 4907 generic.go:334] "Generic (PLEG): container 
finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020" exitCode=0 Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.486564 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020"} Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.486579 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5d070bf673ca6701edcdbe8743e7ed73c0a4ed7e9ac79b197ae87f617dff94dd"} Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.486640 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.487403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.487419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.487426 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.487720 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.488595 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345" exitCode=0 Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.488655 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345"} Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.488675 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"25c75eb6e066d24ef1d0914d536c30795bdadc366af4b81323378b83e5e38762"} Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.488643 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.488725 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.488817 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.488839 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.489537 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.489558 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.489567 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.495296 4907 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613" exitCode=0 Nov 29 
14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.495377 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613"} Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.495405 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7928ac3229aaee929fb70f13b7f1a4a4919515c7b377ad1505df652964739fe8"} Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.495515 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.496337 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.496387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.496406 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.496861 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192"} Nov 29 14:28:23 crc kubenswrapper[4907]: I1129 14:28:23.496889 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1fd5828a7d750e40ec09dfd9be6462c2f007fac4892e122c8b1c5ac47358d969"} Nov 29 
14:28:23 crc kubenswrapper[4907]: E1129 14:28:23.557049 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.47:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187c808e3b39ed22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 14:28:22.38975517 +0000 UTC m=+0.376592832,LastTimestamp:2025-11-29 14:28:22.38975517 +0000 UTC m=+0.376592832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 29 14:28:23 crc kubenswrapper[4907]: W1129 14:28:23.611093 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Nov 29 14:28:23 crc kubenswrapper[4907]: E1129 14:28:23.611180 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError" Nov 29 14:28:23 crc kubenswrapper[4907]: W1129 14:28:23.622150 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Nov 29 14:28:23 crc kubenswrapper[4907]: E1129 14:28:23.622193 4907 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError" Nov 29 14:28:23 crc kubenswrapper[4907]: E1129 14:28:23.803100 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="1.6s" Nov 29 14:28:23 crc kubenswrapper[4907]: W1129 14:28:23.889032 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.47:6443: connect: connection refused Nov 29 14:28:23 crc kubenswrapper[4907]: E1129 14:28:23.889108 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.47:6443: connect: connection refused" logger="UnhandledError" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.059289 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.063512 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.063540 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.063549 4907 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.063573 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 14:28:24 crc kubenswrapper[4907]: E1129 14:28:24.063801 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.47:6443: connect: connection refused" node="crc" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.358709 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.500973 4907 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df" exitCode=0 Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.501042 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.501193 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.501923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.501952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.501964 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.504510 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c722630ed7d79458dc8b77d6193c617baa5e6778268c59b056a310447612d3b8"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.504575 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.505424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.505468 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.505480 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.510146 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"afd8e9e3c38d0d0710dd8297cc120bf4ec2bf18f297b0dc850513d2096377636"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.510177 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1818b34bee237f8b9788cae86c3541ecb29f693da7f3008bda027c4fe45618db"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.510192 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2c842d45e7b04ef536026a952134478e9f8aba8dc779b6bc127d2fc89063af4e"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.510263 4907 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.511135 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.511161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.511171 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.513287 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.513324 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.513337 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.513372 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.514118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.514149 4907 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.514158 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.516084 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.516112 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.516127 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a"} Nov 29 14:28:24 crc kubenswrapper[4907]: I1129 14:28:24.516140 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36"} Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.524012 4907 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca" exitCode=0 Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.524110 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca"} Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.524337 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.525844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.525884 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.525901 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.531104 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.531898 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.532417 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc"} Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.533039 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.533072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.533119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 
14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.533858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.533894 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.533906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.664857 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.667031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.667064 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.667077 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:25 crc kubenswrapper[4907]: I1129 14:28:25.667101 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.538399 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397"} Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.538898 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4"} Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.538932 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1"} Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.538549 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.539028 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.540283 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.540334 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.540351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.931765 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.931986 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.933585 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.933647 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:26 crc kubenswrapper[4907]: I1129 14:28:26.933670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 
14:28:27.545490 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5"} Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.546148 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098"} Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.546418 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.547906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.547954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.547971 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.596717 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.891902 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.892035 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.892081 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.893695 4907 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.893742 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:27 crc kubenswrapper[4907]: I1129 14:28:27.893757 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:28 crc kubenswrapper[4907]: I1129 14:28:28.549516 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:28 crc kubenswrapper[4907]: I1129 14:28:28.551827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:28 crc kubenswrapper[4907]: I1129 14:28:28.551911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:28 crc kubenswrapper[4907]: I1129 14:28:28.551930 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.552422 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.553888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.553971 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.553996 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.565131 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 
14:28:29.565292 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.565351 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.566310 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.566355 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.566372 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.668709 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.668934 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.670514 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.670605 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.670665 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.932092 4907 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Nov 29 14:28:29 crc kubenswrapper[4907]: I1129 14:28:29.932190 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.080234 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.087481 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.555563 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.557150 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.557212 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.557231 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.702072 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.702259 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.703658 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.703709 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:30 crc kubenswrapper[4907]: I1129 14:28:30.703726 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:31 crc kubenswrapper[4907]: I1129 14:28:31.560499 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:28:31 crc kubenswrapper[4907]: I1129 14:28:31.560603 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:31 crc kubenswrapper[4907]: I1129 14:28:31.562293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:31 crc kubenswrapper[4907]: I1129 14:28:31.562360 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:31 crc kubenswrapper[4907]: I1129 14:28:31.562381 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:31 crc kubenswrapper[4907]: I1129 14:28:31.884626 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:32 crc kubenswrapper[4907]: E1129 14:28:32.556386 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 29 14:28:32 crc kubenswrapper[4907]: I1129 14:28:32.563061 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:32 crc kubenswrapper[4907]: I1129 14:28:32.564293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 29 14:28:32 crc kubenswrapper[4907]: I1129 14:28:32.564348 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:32 crc kubenswrapper[4907]: I1129 14:28:32.564368 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.337766 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.337979 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.339366 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.339472 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.339496 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:34 crc kubenswrapper[4907]: E1129 14:28:34.360380 4907 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.391352 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 29 14:28:34 crc kubenswrapper[4907]: 
I1129 14:28:34.803185 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.803473 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.804979 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.805096 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:34 crc kubenswrapper[4907]: I1129 14:28:34.805178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:35 crc kubenswrapper[4907]: W1129 14:28:35.284840 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 29 14:28:35 crc kubenswrapper[4907]: I1129 14:28:35.285115 4907 trace.go:236] Trace[206816651]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 14:28:25.283) (total time: 10001ms): Nov 29 14:28:35 crc kubenswrapper[4907]: Trace[206816651]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:28:35.284) Nov 29 14:28:35 crc kubenswrapper[4907]: Trace[206816651]: [10.001450748s] [10.001450748s] END Nov 29 14:28:35 crc kubenswrapper[4907]: E1129 14:28:35.285292 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: 
TLS handshake timeout" logger="UnhandledError" Nov 29 14:28:35 crc kubenswrapper[4907]: E1129 14:28:35.404143 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Nov 29 14:28:35 crc kubenswrapper[4907]: I1129 14:28:35.590601 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 29 14:28:35 crc kubenswrapper[4907]: I1129 14:28:35.590677 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 29 14:28:35 crc kubenswrapper[4907]: I1129 14:28:35.602579 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 29 14:28:35 crc kubenswrapper[4907]: I1129 14:28:35.602652 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 29 14:28:37 crc 
kubenswrapper[4907]: I1129 14:28:37.901997 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:37 crc kubenswrapper[4907]: I1129 14:28:37.902365 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:37 crc kubenswrapper[4907]: I1129 14:28:37.904263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:37 crc kubenswrapper[4907]: I1129 14:28:37.904336 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:37 crc kubenswrapper[4907]: I1129 14:28:37.904362 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:37 crc kubenswrapper[4907]: I1129 14:28:37.910053 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:38 crc kubenswrapper[4907]: I1129 14:28:38.578779 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:38 crc kubenswrapper[4907]: I1129 14:28:38.580186 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:38 crc kubenswrapper[4907]: I1129 14:28:38.580321 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:38 crc kubenswrapper[4907]: I1129 14:28:38.580351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:38 crc kubenswrapper[4907]: I1129 14:28:38.742353 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 29 14:28:38 crc kubenswrapper[4907]: I1129 14:28:38.763811 4907 reflector.go:368] Caches populated for 
*v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Nov 29 14:28:39 crc kubenswrapper[4907]: I1129 14:28:39.083855 4907 csr.go:261] certificate signing request csr-l629h is approved, waiting to be issued Nov 29 14:28:39 crc kubenswrapper[4907]: I1129 14:28:39.117684 4907 csr.go:257] certificate signing request csr-l629h is issued Nov 29 14:28:39 crc kubenswrapper[4907]: I1129 14:28:39.933413 4907 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 29 14:28:39 crc kubenswrapper[4907]: I1129 14:28:39.933702 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.118803 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-11-29 14:23:39 +0000 UTC, rotation deadline is 2026-10-12 12:20:55.06917657 +0000 UTC Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.118946 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7605h52m14.950239401s for next certificate rotation Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.590297 4907 trace.go:236] Trace[262754730]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 14:28:26.704) (total time: 13886ms): Nov 29 14:28:40 crc kubenswrapper[4907]: Trace[262754730]: ---"Objects listed" 
error: 13886ms (14:28:40.590) Nov 29 14:28:40 crc kubenswrapper[4907]: Trace[262754730]: [13.886170026s] [13.886170026s] END Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.590722 4907 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.591886 4907 trace.go:236] Trace[335819672]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 14:28:26.614) (total time: 13977ms): Nov 29 14:28:40 crc kubenswrapper[4907]: Trace[335819672]: ---"Objects listed" error: 13976ms (14:28:40.591) Nov 29 14:28:40 crc kubenswrapper[4907]: Trace[335819672]: [13.977011142s] [13.977011142s] END Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.592133 4907 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 29 14:28:40 crc kubenswrapper[4907]: E1129 14:28:40.592489 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.592743 4907 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.593287 4907 trace.go:236] Trace[861238231]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 14:28:25.983) (total time: 14609ms): Nov 29 14:28:40 crc kubenswrapper[4907]: Trace[861238231]: ---"Objects listed" error: 14609ms (14:28:40.593) Nov 29 14:28:40 crc kubenswrapper[4907]: Trace[861238231]: [14.609333322s] [14.609333322s] END Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.593328 4907 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.712529 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.712609 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.712992 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.714307 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.714044 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.714385 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.714697 4907 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 29 14:28:40 crc kubenswrapper[4907]: I1129 14:28:40.714766 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.400580 4907 apiserver.go:52] "Watching apiserver" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.403231 4907 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.403631 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-c92rh","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.404066 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.404184 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.404302 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.404373 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c92rh" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.404400 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.404412 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.404473 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.404506 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.404557 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.404600 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.408772 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.408952 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.409133 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.409343 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.409368 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.409544 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.409577 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.409862 4907 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.410088 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.410162 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.410191 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.410717 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.424330 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.437654 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.452332 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.463208 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.472251 4907 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.477394 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.489682 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.501046 4907 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.509886 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.518258 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.556468 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-t4jq9"] Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.556917 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.558725 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.558774 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.559789 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.560091 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.560267 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.573803 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.581964 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.587073 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.589670 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc" exitCode=255 Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.589715 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc"} Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.591962 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597643 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597689 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" 
(UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597707 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597723 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597738 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597754 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597773 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597790 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597808 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597824 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597841 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597860 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597876 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597891 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597907 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597923 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597944 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597963 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597981 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.597996 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598011 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598029 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598051 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598068 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 
14:28:41.598084 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598099 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598116 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598131 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598150 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598166 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598184 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598203 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598220 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598234 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598264 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598281 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598296 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598310 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598324 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598339 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598354 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598373 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598391 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598405 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598426 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598461 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 29 14:28:41 crc 
kubenswrapper[4907]: I1129 14:28:41.598475 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598490 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598506 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598521 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598537 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598552 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598566 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598581 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598597 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598623 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598639 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 14:28:41 crc 
kubenswrapper[4907]: I1129 14:28:41.598663 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598745 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598764 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598780 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598796 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598811 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598828 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598844 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598863 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598879 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598895 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " 
Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598911 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598927 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598945 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598960 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598975 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598992 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599007 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599024 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599039 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599068 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599084 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 
14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599101 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599124 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599144 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599162 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599178 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599193 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599209 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599224 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599239 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599257 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599276 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599293 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599310 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599326 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599342 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599359 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599375 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599392 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599412 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599428 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599462 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599477 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599494 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599511 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599528 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599544 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599561 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599577 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599593 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599609 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599626 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599643 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599658 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 29 14:28:41 crc 
kubenswrapper[4907]: I1129 14:28:41.599674 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599693 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599709 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599729 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599745 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599762 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599777 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599792 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599810 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599826 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599841 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599856 
4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599875 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599892 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599908 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599923 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599941 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599959 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599976 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599992 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600008 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600026 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600042 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600058 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600074 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600093 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600115 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600131 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600148 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600163 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600182 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600202 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600222 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600240 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600257 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600273 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600289 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600304 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600322 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600338 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600355 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600372 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600389 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600404 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600421 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600458 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600474 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600490 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600507 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600525 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 29 14:28:41 
crc kubenswrapper[4907]: I1129 14:28:41.600543 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600559 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600577 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600594 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600610 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600628 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600644 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600660 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600681 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600698 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600715 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600732 
4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600748 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600765 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600781 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600801 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600819 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") 
pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600836 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600854 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600871 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600887 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600904 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600921 
4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600937 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600957 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600974 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600991 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601007 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601024 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601041 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601060 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601082 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601100 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601131 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601173 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601195 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601213 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601234 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 14:28:41 crc kubenswrapper[4907]: 
I1129 14:28:41.601254 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601298 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601318 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601335 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601353 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0ff04d25-6931-42f8-af97-0f231dfb8d55-hosts-file\") pod \"node-resolver-c92rh\" (UID: \"0ff04d25-6931-42f8-af97-0f231dfb8d55\") " pod="openshift-dns/node-resolver-c92rh" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601371 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601389 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601406 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hztt\" (UniqueName: \"kubernetes.io/projected/0ff04d25-6931-42f8-af97-0f231dfb8d55-kube-api-access-2hztt\") pod \"node-resolver-c92rh\" (UID: \"0ff04d25-6931-42f8-af97-0f231dfb8d55\") " pod="openshift-dns/node-resolver-c92rh" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601427 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601475 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601523 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.613197 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.614028 4907 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.619612 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.620223 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.620541 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.628727 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.629847 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.598876 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599043 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.599337 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600061 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600341 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600555 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600777 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.600997 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601043 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601237 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601616 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.601622 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.639142 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.639147 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.639172 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.639252 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601816 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.601900 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.639335 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.602162 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.602460 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.602620 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.639386 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.602754 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.602860 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.602979 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.639562 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.640472 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.640628 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.640767 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.640782 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.603021 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.603054 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.603141 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.603310 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.603366 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.603572 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.603621 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.603673 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.603729 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.604026 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.604206 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.604256 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.604069 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.604334 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.604511 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.605216 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.605306 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.605728 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.605745 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.605796 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.606140 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.606227 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.606241 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.606363 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.606507 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.606546 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.606636 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.606802 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.606860 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.607116 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.607040 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.607310 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.607328 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.607511 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.607591 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.607587 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.607687 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.607945 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.608219 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.608627 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.608729 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.609172 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.609573 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.609638 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.609672 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.609882 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.610018 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.610302 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.610373 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.610629 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.611226 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.611242 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.611700 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.642423 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.611772 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.612025 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.612375 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.612418 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.612510 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.612838 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.612884 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.612899 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.613395 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.613520 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.613696 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.613872 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.614368 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.614395 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.614784 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.614876 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.615093 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.615138 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.615009 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.615554 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.615541 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.615561 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.615643 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.616256 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.616360 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.616391 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.616808 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.616718 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.616856 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.618774 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.619031 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.619329 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.620693 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.620986 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.621344 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.621538 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.621672 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.621773 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.622071 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.622104 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.622148 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.628061 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.628491 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.643004 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.643022 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.628735 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.628894 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.630058 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.633607 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.634082 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.634168 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.634600 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.634623 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.634674 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.634725 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.634739 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.635606 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.635858 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.636166 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.636454 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.636925 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.637572 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.637689 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.637781 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.638128 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:28:42.138104162 +0000 UTC m=+20.124941814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638278 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638378 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638393 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.643373 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:42.143349005 +0000 UTC m=+20.130186647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638423 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638682 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638808 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638820 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638819 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638833 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.638908 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.643869 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:42.143833419 +0000 UTC m=+20.130671101 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.643912 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:42.14389816 +0000 UTC m=+20.130735842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.643951 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.645813 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.648870 4907 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.648906 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.648924 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:41 crc kubenswrapper[4907]: E1129 14:28:41.648979 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:42.148958049 +0000 UTC m=+20.135795711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.649037 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.652084 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.659313 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.659407 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.661554 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.661576 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.662060 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.662157 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.662690 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.663207 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.663254 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.663509 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.664089 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.664217 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.664738 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.664929 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.665182 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.665362 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.665380 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.665839 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.665827 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.665924 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.666164 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.667081 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.667237 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.667777 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.667842 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.667947 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.680791 4907 scope.go:117] "RemoveContainer" containerID="4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.682352 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.682733 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.683365 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.683112 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.684484 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.685042 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.685082 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.685567 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.685699 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.685928 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.686101 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.686252 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.693089 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.693631 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.698846 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704189 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704544 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704585 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-mcd-auth-proxy-config\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704609 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0ff04d25-6931-42f8-af97-0f231dfb8d55-hosts-file\") pod \"node-resolver-c92rh\" (UID: \"0ff04d25-6931-42f8-af97-0f231dfb8d55\") " pod="openshift-dns/node-resolver-c92rh" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704634 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hztt\" (UniqueName: \"kubernetes.io/projected/0ff04d25-6931-42f8-af97-0f231dfb8d55-kube-api-access-2hztt\") pod \"node-resolver-c92rh\" (UID: \"0ff04d25-6931-42f8-af97-0f231dfb8d55\") " pod="openshift-dns/node-resolver-c92rh" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704658 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg5ff\" (UniqueName: \"kubernetes.io/projected/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-kube-api-access-qg5ff\") pod \"machine-config-daemon-t4jq9\" (UID: 
\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704698 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704710 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-proxy-tls\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704786 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0ff04d25-6931-42f8-af97-0f231dfb8d55-hosts-file\") pod \"node-resolver-c92rh\" (UID: \"0ff04d25-6931-42f8-af97-0f231dfb8d55\") " pod="openshift-dns/node-resolver-c92rh" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.704894 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705040 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705099 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-rootfs\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705160 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705180 4907 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705194 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705206 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705218 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705232 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" 
(UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705245 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705257 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705270 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705299 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705311 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705323 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705336 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705348 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705360 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705374 4907 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705385 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705397 4907 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705410 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705422 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node 
\"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705458 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705474 4907 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705485 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705498 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705511 4907 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705523 4907 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705535 4907 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705550 4907 
reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705562 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705575 4907 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705587 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705599 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705610 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705622 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705634 4907 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705647 4907 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705659 4907 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705671 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705682 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705694 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705706 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705717 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath 
\"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705728 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705740 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705752 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705763 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705775 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705787 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705799 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705811 4907 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705822 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705835 4907 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705846 4907 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705858 4907 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705870 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705881 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705893 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705905 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705916 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705927 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705939 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705951 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705963 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705975 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705988 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.705999 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706012 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706024 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706035 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706046 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706058 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") 
on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706070 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706081 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706092 4907 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706103 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706114 4907 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706125 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706136 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706148 4907 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706159 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706171 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706183 4907 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706195 4907 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706205 4907 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706217 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706231 4907 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706243 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706255 4907 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706266 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706278 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706290 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706305 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706331 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") 
on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706342 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706353 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706364 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706376 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706387 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706399 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706410 4907 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706422 4907 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.706434 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709050 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709078 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709091 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709106 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709119 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709132 4907 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709143 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709155 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709167 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709178 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709190 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709202 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709214 4907 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node 
\"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709225 4907 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709238 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709250 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709262 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709274 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709305 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709318 4907 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" 
DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709330 4907 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709342 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709354 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709364 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709376 4907 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709387 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709399 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709410 4907 reconciler_common.go:293] 
"Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709422 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709433 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709463 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709474 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709487 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709498 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709510 4907 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709520 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709532 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709544 4907 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709555 4907 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709566 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709578 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709591 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath 
\"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709603 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709615 4907 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709628 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709642 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709654 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709666 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709677 4907 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 
14:28:41.709702 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709714 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709727 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709739 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709751 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709762 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709775 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 
14:28:41.709787 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709799 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709811 4907 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709823 4907 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709835 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709847 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709859 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709871 4907 reconciler_common.go:293] "Volume 
detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709883 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709894 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709910 4907 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709922 4907 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709933 4907 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709945 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709958 4907 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") 
on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709970 4907 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709983 4907 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.709994 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710006 4907 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710018 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710029 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710042 4907 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710053 4907 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710064 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710076 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710088 4907 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710100 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710111 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710123 4907 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710135 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710146 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710163 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.710174 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.711847 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.714550 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.719868 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.722449 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.724899 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.725563 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hztt\" (UniqueName: \"kubernetes.io/projected/0ff04d25-6931-42f8-af97-0f231dfb8d55-kube-api-access-2hztt\") pod \"node-resolver-c92rh\" (UID: \"0ff04d25-6931-42f8-af97-0f231dfb8d55\") " pod="openshift-dns/node-resolver-c92rh" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.730052 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.731013 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.736640 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-c92rh" Nov 29 14:28:41 crc kubenswrapper[4907]: W1129 14:28:41.741863 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-141061e79cd0453a04225ce6545663f1ff8205a33fcfa51e6a7144f0ad196d11 WatchSource:0}: Error finding container 141061e79cd0453a04225ce6545663f1ff8205a33fcfa51e6a7144f0ad196d11: Status 404 returned error can't find the container with id 141061e79cd0453a04225ce6545663f1ff8205a33fcfa51e6a7144f0ad196d11 Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.755467 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.767359 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.775673 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.792964 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.809867 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.812599 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg5ff\" (UniqueName: \"kubernetes.io/projected/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-kube-api-access-qg5ff\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.812739 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-proxy-tls\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.812762 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-rootfs\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.812788 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-mcd-auth-proxy-config\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 
14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.812817 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.812827 4907 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.813140 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-rootfs\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.813613 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-mcd-auth-proxy-config\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.818563 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-proxy-tls\") pod \"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.832255 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg5ff\" (UniqueName: \"kubernetes.io/projected/58e4d8d7-8362-41f0-80eb-c07a9219ffbd-kube-api-access-qg5ff\") pod 
\"machine-config-daemon-t4jq9\" (UID: \"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\") " pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.871491 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.888353 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.902519 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.909499 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.916692 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.930401 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 
+0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.936952 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-d5zvb"] Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.937553 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d5zvb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.938009 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-pngnb"] Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.940354 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.941143 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.941240 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.941344 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.941596 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.942037 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.946200 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.957014 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.963936 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:41 crc kubenswrapper[4907]: I1129 14:28:41.979496 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.002837 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.015531 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.036714 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.076006 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.094853 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115070 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-cnibin\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 
crc kubenswrapper[4907]: I1129 14:28:42.115105 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-run-netns\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115138 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115158 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-cni-dir\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115190 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-system-cni-dir\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115214 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-etc-kubernetes\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 
14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115235 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2tj\" (UniqueName: \"kubernetes.io/projected/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-kube-api-access-sj2tj\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115268 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-os-release\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115283 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-socket-dir-parent\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115296 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-os-release\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115313 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-conf-dir\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc 
kubenswrapper[4907]: I1129 14:28:42.115330 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-daemon-config\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115365 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-run-multus-certs\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115426 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-cni-binary-copy\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115513 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-var-lib-cni-bin\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115533 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-var-lib-cni-multus\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 
14:28:42.115753 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-cnibin\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115803 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-system-cni-dir\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115849 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-var-lib-kubelet\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115870 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-hostroot\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115890 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/27b9dc6c-d485-4b7b-94b1-e71337539997-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115912 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvvbf\" (UniqueName: \"kubernetes.io/projected/27b9dc6c-d485-4b7b-94b1-e71337539997-kube-api-access-cvvbf\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115928 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-run-k8s-cni-cncf-io\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.115943 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27b9dc6c-d485-4b7b-94b1-e71337539997-cni-binary-copy\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.124294 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.132775 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.148624 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.163497 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 
+0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.174800 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.185432 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.195157 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.207076 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217176 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217302 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-conf-dir\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217368 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-conf-dir\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.217359 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:28:43.217327122 +0000 UTC m=+21.204164914 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217415 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-daemon-config\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217461 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-run-multus-certs\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217483 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-cni-binary-copy\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217513 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217534 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-var-lib-cni-bin\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217553 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-var-lib-cni-multus\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217572 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-cnibin\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217590 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-system-cni-dir\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217609 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217626 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-var-lib-kubelet\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217642 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-hostroot\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217661 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/27b9dc6c-d485-4b7b-94b1-e71337539997-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217679 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvvbf\" (UniqueName: \"kubernetes.io/projected/27b9dc6c-d485-4b7b-94b1-e71337539997-kube-api-access-cvvbf\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217697 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-run-k8s-cni-cncf-io\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217713 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27b9dc6c-d485-4b7b-94b1-e71337539997-cni-binary-copy\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217731 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-cnibin\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217746 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-run-netns\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217771 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217788 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-cni-dir\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217805 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-system-cni-dir\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217826 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-etc-kubernetes\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217848 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj2tj\" (UniqueName: \"kubernetes.io/projected/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-kube-api-access-sj2tj\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217868 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217889 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217905 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-os-release\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217927 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-socket-dir-parent\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.217946 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-os-release\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218085 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-daemon-config\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218986 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-socket-dir-parent\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218136 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-run-multus-certs\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218347 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-os-release\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218499 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-cnibin\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218524 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-run-netns\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.218568 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 
14:28:42.218647 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-var-lib-kubelet\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218687 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-hostroot\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218701 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-var-lib-cni-bin\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.218655 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.219104 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.219120 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218738 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-var-lib-cni-multus\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.218831 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.218919 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.219187 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218777 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-multus-cni-dir\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218857 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-etc-kubernetes\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218642 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-cni-binary-copy\") pod \"multus-d5zvb\" (UID: 
\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218716 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-cnibin\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.219211 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27b9dc6c-d485-4b7b-94b1-e71337539997-cni-binary-copy\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.219199 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218961 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-os-release\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218813 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-system-cni-dir\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: 
I1129 14:28:42.218791 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-host-run-k8s-cni-cncf-io\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.218758 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-system-cni-dir\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.219086 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:43.21907421 +0000 UTC m=+21.205911862 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.219325 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:43.219307456 +0000 UTC m=+21.206145108 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.219342 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:43.219335307 +0000 UTC m=+21.206172959 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:42 crc kubenswrapper[4907]: E1129 14:28:42.219359 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:43.219354647 +0000 UTC m=+21.206192299 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.219353 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27b9dc6c-d485-4b7b-94b1-e71337539997-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.219554 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.220224 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/27b9dc6c-d485-4b7b-94b1-e71337539997-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.231657 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.239331 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2tj\" (UniqueName: \"kubernetes.io/projected/3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4-kube-api-access-sj2tj\") pod \"multus-d5zvb\" (UID: \"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\") " pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.239335 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvvbf\" (UniqueName: \"kubernetes.io/projected/27b9dc6c-d485-4b7b-94b1-e71337539997-kube-api-access-cvvbf\") pod \"multus-additional-cni-plugins-pngnb\" (UID: \"27b9dc6c-d485-4b7b-94b1-e71337539997\") " pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.240721 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.295238 4907 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.295336 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dtnl8"] Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295605 4907 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295617 4907 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295689 4907 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295716 4907 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"multus-daemon-config": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295725 4907 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: very short watch: 
object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295751 4907 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295780 4907 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295810 4907 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295838 4907 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295864 4907 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295892 4907 reflector.go:484] 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295934 4907 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295961 4907 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.295987 4907 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"default-dockercfg-2q5b6": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296000 4907 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296036 4907 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 
14:28:42.296068 4907 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296093 4907 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296122 4907 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296145 4907 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296147 4907 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.296149 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296178 4907 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296207 4907 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296015 4907 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.296259 4907 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.304875 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.304972 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.306364 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" 
Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.306404 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.306427 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.306645 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.306691 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.313934 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 
+0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319186 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-netns\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319215 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-etc-openvswitch\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319236 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5339013-9544-4e7e-a449-c257f1086638-ovn-node-metrics-cert\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319260 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-slash\") pod \"ovnkube-node-dtnl8\" (UID: 
\"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319280 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-var-lib-openvswitch\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319320 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-bin\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319341 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-netd\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319362 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-script-lib\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319380 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-openvswitch\") pod 
\"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grsbm\" (UniqueName: \"kubernetes.io/projected/e5339013-9544-4e7e-a449-c257f1086638-kube-api-access-grsbm\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319452 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-systemd-units\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319496 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-log-socket\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319517 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-config\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319580 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-env-overrides\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319597 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-systemd\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319614 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-node-log\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319681 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-ovn\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319700 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-ovn-kubernetes\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319718 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-kubelet\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.319738 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.320210 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d5zvb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.328251 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.335422 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bc30bb0_1a1a_48df_af8a_c023bfbfa3f4.slice/crio-ae6d5120293205af9f59874b204cd3c0d245b87b79a857fb59f8f2b35aea367d WatchSource:0}: Error finding container 
ae6d5120293205af9f59874b204cd3c0d245b87b79a857fb59f8f2b35aea367d: Status 404 returned error can't find the container with id ae6d5120293205af9f59874b204cd3c0d245b87b79a857fb59f8f2b35aea367d Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.340742 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.347037 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pngnb" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.353860 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: W1129 14:28:42.359933 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27b9dc6c_d485_4b7b_94b1_e71337539997.slice/crio-f0a9670c85a4b967819fa92d234eb64ddaa831880aca5c021717e2c596081bc6 WatchSource:0}: Error finding container f0a9670c85a4b967819fa92d234eb64ddaa831880aca5c021717e2c596081bc6: Status 404 returned error can't find the container with id f0a9670c85a4b967819fa92d234eb64ddaa831880aca5c021717e2c596081bc6 Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.376678 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.395432 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.406971 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.421172 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-netns\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.421218 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-etc-openvswitch\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.421245 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5339013-9544-4e7e-a449-c257f1086638-ovn-node-metrics-cert\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.421271 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-slash\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.421294 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-var-lib-openvswitch\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.421317 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-netns\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.421380 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-etc-openvswitch\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422013 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-script-lib\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.421331 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-script-lib\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422072 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-bin\") pod 
\"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422088 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-netd\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422118 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-openvswitch\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422133 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grsbm\" (UniqueName: \"kubernetes.io/projected/e5339013-9544-4e7e-a449-c257f1086638-kube-api-access-grsbm\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422157 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-systemd-units\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422160 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-netd\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422195 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-log-socket\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422176 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-log-socket\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422131 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-slash\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-bin\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422431 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-config\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422494 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-systemd-units\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422698 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-openvswitch\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423272 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-config\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423322 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-env-overrides\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.422262 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-var-lib-openvswitch\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423431 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-systemd\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423513 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-node-log\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423548 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-ovn\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423566 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-ovn-kubernetes\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423591 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-kubelet\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423609 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423659 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423685 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-node-log\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423706 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-ovn\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423726 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-ovn-kubernetes\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423747 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-kubelet\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.423757 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-env-overrides\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.424480 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-systemd\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.425749 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.426189 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5339013-9544-4e7e-a449-c257f1086638-ovn-node-metrics-cert\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.443738 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grsbm\" (UniqueName: \"kubernetes.io/projected/e5339013-9544-4e7e-a449-c257f1086638-kube-api-access-grsbm\") pod \"ovnkube-node-dtnl8\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc 
kubenswrapper[4907]: I1129 14:28:42.473087 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.484007 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.484547 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.487049 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.488038 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.489347 
4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.489967 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.490741 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.492690 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.493616 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.495097 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.495914 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.497792 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.498623 
4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.499788 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.501414 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.502566 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.503849 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.504400 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.505258 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.506456 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.507058 
4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.508180 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.509201 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.509942 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.510834 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.511610 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.512755 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.514565 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.515429 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.517061 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.518282 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.519826 4907 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.519943 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.522639 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.524099 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.524627 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.526239 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.527071 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.528369 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.529079 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.530788 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.531571 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.532756 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.533471 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.534664 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.535194 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.536271 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.536905 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.538815 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.539555 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.540849 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.541391 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.542618 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.543398 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.543953 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.552773 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.589304 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce905
8bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.592950 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ad16323c0ebfbcc052f387d0634484e4e260fa4e800bcbeaac2b40381c8a5b9a"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.594886 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.595005 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.595068 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8a082944a9bd35bf417767ccbb37f8074b4c3e67dde2e2099526f3e78cf88f2f"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.596672 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.596720 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"141061e79cd0453a04225ce6545663f1ff8205a33fcfa51e6a7144f0ad196d11"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.599101 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.601098 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.601337 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:28:42 crc 
kubenswrapper[4907]: I1129 14:28:42.602505 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5zvb" event={"ID":"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4","Type":"ContainerStarted","Data":"bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.602548 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5zvb" event={"ID":"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4","Type":"ContainerStarted","Data":"ae6d5120293205af9f59874b204cd3c0d245b87b79a857fb59f8f2b35aea367d"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.603922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" event={"ID":"27b9dc6c-d485-4b7b-94b1-e71337539997","Type":"ContainerStarted","Data":"f0a9670c85a4b967819fa92d234eb64ddaa831880aca5c021717e2c596081bc6"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.607176 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.608416 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.608957 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.609046 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"2e26569df69750bbac444551783ad88e7473acf99443e7662078cb86974234ee"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.612650 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c92rh" event={"ID":"0ff04d25-6931-42f8-af97-0f231dfb8d55","Type":"ContainerStarted","Data":"6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.612760 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c92rh" event={"ID":"0ff04d25-6931-42f8-af97-0f231dfb8d55","Type":"ContainerStarted","Data":"633f63cac0b3eec74159e4f19db37ce28c10889a5d610359a92699c633d9bcac"} Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.645992 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.675706 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.715187 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.761662 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.794425 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.839194 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.870056 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.912387 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.953664 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:42 crc kubenswrapper[4907]: I1129 14:28:42.995251 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.040425 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.076927 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.112032 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a
3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.162927 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.185430 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.192967 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.224742 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.233512 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.233618 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.233647 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.233665 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233693 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:28:45.233668899 +0000 UTC m=+23.220506561 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.233751 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233770 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233787 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233797 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233839 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:45.233826853 +0000 UTC m=+23.220664505 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233851 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233887 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233897 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233921 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233844 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233928 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:45.233919445 +0000 UTC m=+23.220757107 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.233996 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:45.233984137 +0000 UTC m=+23.220821799 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.234023 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:45.234003758 +0000 UTC m=+23.220841420 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.243201 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.283039 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.286176 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.302751 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.351025 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.362382 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.383080 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.403463 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.442872 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.462357 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.478863 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.479126 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.478925 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.479588 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.478898 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.479752 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.482858 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.510078 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.522226 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.542407 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.562482 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.583170 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.617076 4907 generic.go:334] "Generic (PLEG): container finished" podID="27b9dc6c-d485-4b7b-94b1-e71337539997" containerID="178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22" exitCode=0 Nov 29 14:28:43 crc kubenswrapper[4907]: 
I1129 14:28:43.617170 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" event={"ID":"27b9dc6c-d485-4b7b-94b1-e71337539997","Type":"ContainerDied","Data":"178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22"} Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.618953 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76" exitCode=0 Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.619586 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76"} Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.619625 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"24a5b3a2a144c6df76a28e440dd384c01ef566ae2cf415a58902c213034e790c"} Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.623133 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.642860 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.680340 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.682877 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.707609 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.742276 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.763348 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.792703 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.794014 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.795231 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.795291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.795311 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.795489 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.858737 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.863796 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.883091 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.903823 4907 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.904617 4907 
kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.906815 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.906868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.906880 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.906903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.906919 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:43Z","lastTransitionTime":"2025-11-29T14:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.930395 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.934969 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.935008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.935017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.935034 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.935049 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:43Z","lastTransitionTime":"2025-11-29T14:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.948953 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.953246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.953277 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.953287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.953303 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.953314 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:43Z","lastTransitionTime":"2025-11-29T14:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.955312 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.967372 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.971331 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.971363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.971375 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.971391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.971405 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:43Z","lastTransitionTime":"2025-11-29T14:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:43 crc kubenswrapper[4907]: E1129 14:28:43.985233 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.989692 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.989748 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.989763 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.989787 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:43 crc kubenswrapper[4907]: I1129 14:28:43.989803 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:43Z","lastTransitionTime":"2025-11-29T14:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.001150 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:43Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: E1129 14:28:44.011537 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: E1129 14:28:44.011691 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.014125 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.014177 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.014195 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.014215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.014234 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.032905 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.073384 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.112785 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c0
22a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.116588 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.116620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.116633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 
29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.116658 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.116676 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.163373 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.197425 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.212571 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vg6gc"] Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.213166 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.220063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.220119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.220133 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.220166 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.220182 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.235556 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.242791 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.248335 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0767139-51fc-4c53-aa4d-c52b815fcc81-host\") pod \"node-ca-vg6gc\" (UID: \"c0767139-51fc-4c53-aa4d-c52b815fcc81\") " pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.248367 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slcqw\" (UniqueName: \"kubernetes.io/projected/c0767139-51fc-4c53-aa4d-c52b815fcc81-kube-api-access-slcqw\") pod \"node-ca-vg6gc\" (UID: \"c0767139-51fc-4c53-aa4d-c52b815fcc81\") " pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.248393 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c0767139-51fc-4c53-aa4d-c52b815fcc81-serviceca\") pod 
\"node-ca-vg6gc\" (UID: \"c0767139-51fc-4c53-aa4d-c52b815fcc81\") " pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.262164 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.282751 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.303621 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.333075 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.333115 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.333132 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.333154 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.333170 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.349844 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0767139-51fc-4c53-aa4d-c52b815fcc81-host\") pod \"node-ca-vg6gc\" (UID: \"c0767139-51fc-4c53-aa4d-c52b815fcc81\") " pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.349902 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slcqw\" (UniqueName: \"kubernetes.io/projected/c0767139-51fc-4c53-aa4d-c52b815fcc81-kube-api-access-slcqw\") pod \"node-ca-vg6gc\" (UID: \"c0767139-51fc-4c53-aa4d-c52b815fcc81\") " pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.349940 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c0767139-51fc-4c53-aa4d-c52b815fcc81-serviceca\") pod \"node-ca-vg6gc\" (UID: \"c0767139-51fc-4c53-aa4d-c52b815fcc81\") " pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.350040 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0767139-51fc-4c53-aa4d-c52b815fcc81-host\") pod \"node-ca-vg6gc\" (UID: \"c0767139-51fc-4c53-aa4d-c52b815fcc81\") " pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.350930 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c0767139-51fc-4c53-aa4d-c52b815fcc81-serviceca\") pod \"node-ca-vg6gc\" (UID: \"c0767139-51fc-4c53-aa4d-c52b815fcc81\") " pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.354388 4907 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.389090 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slcqw\" (UniqueName: \"kubernetes.io/projected/c0767139-51fc-4c53-aa4d-c52b815fcc81-kube-api-access-slcqw\") pod \"node-ca-vg6gc\" (UID: \"c0767139-51fc-4c53-aa4d-c52b815fcc81\") " pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.416294 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.435553 4907 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.435607 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.435621 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.435643 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.435659 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.459545 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.491117 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.526888 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vg6gc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.537975 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.538615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.538647 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.538661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.538681 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.538696 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.571219 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.615154 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.651093 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.674001 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.674101 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.674381 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.674392 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.674402 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" 
event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.674412 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.675791 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.675829 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.675839 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.675855 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.675867 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.677252 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vg6gc" event={"ID":"c0767139-51fc-4c53-aa4d-c52b815fcc81","Type":"ContainerStarted","Data":"20414a2fb83132ff89f55415a9e61860b9afae151a20b5458e8b0e89c7ff1b19"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.683184 4907 generic.go:334] "Generic (PLEG): container finished" podID="27b9dc6c-d485-4b7b-94b1-e71337539997" containerID="5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0" exitCode=0 Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.683252 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" event={"ID":"27b9dc6c-d485-4b7b-94b1-e71337539997","Type":"ContainerDied","Data":"5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.693527 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.733620 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.774875 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.779327 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.779351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.779359 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.779375 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.779383 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.827854 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.837055 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.856684 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.857324 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.878655 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.883384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.883417 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.883425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.883455 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.883466 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.911568 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.951069 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.987322 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.987391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.987412 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.987462 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.987481 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:44Z","lastTransitionTime":"2025-11-29T14:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:44 crc kubenswrapper[4907]: I1129 14:28:44.992744 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:44Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.033126 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce905
8bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.076837 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.089068 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.089129 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.089142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.089165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.089179 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:45Z","lastTransitionTime":"2025-11-29T14:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.114578 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.151712 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.192139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.192177 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.192189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.192206 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.192219 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:45Z","lastTransitionTime":"2025-11-29T14:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.222718 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.254872 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.260152 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.260279 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260337 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:28:49.260302434 +0000 UTC m=+27.247140096 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.260412 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260416 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.260480 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260523 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:49.26050845 +0000 UTC m=+27.247346162 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.260546 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260467 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260629 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260638 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260650 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260659 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:45 crc 
kubenswrapper[4907]: E1129 14:28:45.260665 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260674 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260715 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:49.260705436 +0000 UTC m=+27.247543088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260770 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:49.260724356 +0000 UTC m=+27.247562008 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.260800 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:49.260790518 +0000 UTC m=+27.247628170 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.285736 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] 
validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",
\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.294973 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.295003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.295012 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.295027 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.295039 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:45Z","lastTransitionTime":"2025-11-29T14:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.312831 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.352853 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.393200 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.398653 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.398705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.398724 4907 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.398750 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.398782 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:45Z","lastTransitionTime":"2025-11-29T14:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.432663 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.475310 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.478496 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.478651 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.478523 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.478870 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.479002 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.479236 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.501915 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.502053 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.502141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.502338 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.502467 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:45Z","lastTransitionTime":"2025-11-29T14:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.513721 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.556505 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.598821 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce905
8bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.605285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.605503 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.605648 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.605767 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.605888 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:45Z","lastTransitionTime":"2025-11-29T14:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.641541 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.677152 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.689593 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vg6gc" event={"ID":"c0767139-51fc-4c53-aa4d-c52b815fcc81","Type":"ContainerStarted","Data":"400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.692260 4907 generic.go:334] "Generic (PLEG): container finished" podID="27b9dc6c-d485-4b7b-94b1-e71337539997" containerID="c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54" exitCode=0 Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.692324 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" event={"ID":"27b9dc6c-d485-4b7b-94b1-e71337539997","Type":"ContainerDied","Data":"c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.694706 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.708726 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.708779 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.708797 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.708820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.708838 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:45Z","lastTransitionTime":"2025-11-29T14:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:45 crc kubenswrapper[4907]: E1129 14:28:45.734981 4907 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.738396 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fc
e098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.774301 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] 
validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",
\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.812089 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.812156 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.812174 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.812200 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.812217 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:45Z","lastTransitionTime":"2025-11-29T14:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.814532 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.851183 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.895678 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.914668 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.914710 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.914720 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.914737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.914749 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:45Z","lastTransitionTime":"2025-11-29T14:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.934114 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:45 crc kubenswrapper[4907]: I1129 14:28:45.972910 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:45Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.016686 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.017246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.017282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.017293 4907 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.017313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.017324 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.067485 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.098151 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.119945 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.120013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.120032 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.120056 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.120073 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.137046 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.175341 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.216593 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.223153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.223253 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.223278 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.223317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.223342 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.256550 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.291897 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c0
22a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.327223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.327259 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.327270 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 
29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.327287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.327297 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.336417 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5
ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.370910 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.412305 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.430105 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.430141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.430153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.430173 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.430186 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.453139 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.493991 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.532543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.532584 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.532595 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 
14:28:46.532614 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.532627 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.536924 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.576995 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.618726 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a
3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.635733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.635769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.635778 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.635794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.635804 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.704835 4907 generic.go:334] "Generic (PLEG): container finished" podID="27b9dc6c-d485-4b7b-94b1-e71337539997" containerID="fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f" exitCode=0 Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.704911 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" event={"ID":"27b9dc6c-d485-4b7b-94b1-e71337539997","Type":"ContainerDied","Data":"fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.711989 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.735001 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.738898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.738988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.739012 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:46 crc 
kubenswrapper[4907]: I1129 14:28:46.739044 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.739066 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.755900 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.789859 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\
\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.809300 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.832028 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.843066 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.843125 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.843144 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.843173 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.843193 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.852622 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.895565 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.934046 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.937556 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.944597 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.945264 4907 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.945299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.945311 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.945331 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.945345 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:46Z","lastTransitionTime":"2025-11-29T14:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:46 crc kubenswrapper[4907]: I1129 14:28:46.975785 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.014938 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.048147 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.048191 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.048203 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.048221 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.048258 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.054565 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.091980 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.133090 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.150941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.151011 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.151030 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.151056 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.151075 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.175246 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a
3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.219669 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.254283 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.254344 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.254364 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.254390 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.254412 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.255773 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.294341 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.339588 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.357890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.357951 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.357969 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.357998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.358018 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.374520 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.414618 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.452090 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.460923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.460997 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.461024 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.461080 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.461105 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.478568 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.478587 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.478690 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:47 crc kubenswrapper[4907]: E1129 14:28:47.478919 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:47 crc kubenswrapper[4907]: E1129 14:28:47.479111 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:47 crc kubenswrapper[4907]: E1129 14:28:47.479339 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.496393 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs
.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.535568 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.563481 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.563562 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.563581 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.563607 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.563627 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.589955 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.620578 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.652590 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.666064 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.666113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.666187 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.666207 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.666222 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.695081 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.720534 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" event={"ID":"27b9dc6c-d485-4b7b-94b1-e71337539997","Type":"ContainerStarted","Data":"36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.736221 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.772079 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc 
kubenswrapper[4907]: I1129 14:28:47.772140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.772157 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.772182 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.772206 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.792181 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.817150 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.856474 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fda
d6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.875338 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.875374 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.875386 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.875403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.875416 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.893506 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.935920 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.975821 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:47Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.978745 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.978818 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.978835 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.978862 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:47 crc kubenswrapper[4907]: I1129 14:28:47.978878 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:47Z","lastTransitionTime":"2025-11-29T14:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.019102 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9
058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.069084 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.082131 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.082192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.082209 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.082238 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.082257 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:48Z","lastTransitionTime":"2025-11-29T14:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.095918 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.140707 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1
cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.176684 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.186002 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.186084 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.186109 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.186139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.186160 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:48Z","lastTransitionTime":"2025-11-29T14:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.218722 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.258936 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.289320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:48 crc 
kubenswrapper[4907]: I1129 14:28:48.289392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.289415 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.289494 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.289521 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:48Z","lastTransitionTime":"2025-11-29T14:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.311911 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.338176 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.375302 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.393005 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.393055 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.393070 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:48 crc 
kubenswrapper[4907]: I1129 14:28:48.393098 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.393116 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:48Z","lastTransitionTime":"2025-11-29T14:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.419662 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5
ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.497392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.497533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.497561 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.497612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.497637 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:48Z","lastTransitionTime":"2025-11-29T14:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.600951 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.601036 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.601061 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.601105 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.601127 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:48Z","lastTransitionTime":"2025-11-29T14:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.703873 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.703919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.703935 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.703960 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.703978 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:48Z","lastTransitionTime":"2025-11-29T14:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.730895 4907 generic.go:334] "Generic (PLEG): container finished" podID="27b9dc6c-d485-4b7b-94b1-e71337539997" containerID="36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0" exitCode=0 Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.731137 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" event={"ID":"27b9dc6c-d485-4b7b-94b1-e71337539997","Type":"ContainerDied","Data":"36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.750990 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.786952 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.804169 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 
14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.806583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.806649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.806663 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.806683 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.806725 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:48Z","lastTransitionTime":"2025-11-29T14:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.820753 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.833466 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.856827 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.871527 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.884115 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1
cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.899029 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.909621 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 
14:28:48.909666 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.909678 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.909699 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.909713 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:48Z","lastTransitionTime":"2025-11-29T14:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.909909 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.923616 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.934362 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.943903 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:48 crc kubenswrapper[4907]: I1129 14:28:48.969558 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:48Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.010039 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.011730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.011764 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.011774 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.011791 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.011802 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.113879 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.113929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.113940 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.113957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.113968 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.218547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.218606 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.218626 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.218657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.218681 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.305693 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.305812 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.305855 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.305911 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:28:57.305872228 +0000 UTC m=+35.292709900 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.305958 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.305986 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.305980 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.306074 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.306090 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:49 crc 
kubenswrapper[4907]: E1129 14:28:49.306013 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:57.306002271 +0000 UTC m=+35.292839923 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.306140 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.306039 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.306209 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:57.306198707 +0000 UTC m=+35.293036379 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.306215 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.306229 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.306247 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.306264 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:57.306254008 +0000 UTC m=+35.293091660 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.306287 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:28:57.306274999 +0000 UTC m=+35.293112651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.320216 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.320944 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.321048 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.321141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.321223 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.424785 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.424840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.424858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.424881 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.424898 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.478603 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.478693 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.478788 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.478888 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.478988 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:49 crc kubenswrapper[4907]: E1129 14:28:49.479083 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.527351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.527416 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.527468 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.527501 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.527519 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.630982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.631365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.631382 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.631411 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.631431 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.733831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.733874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.733886 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.733904 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.733916 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.737713 4907 generic.go:334] "Generic (PLEG): container finished" podID="27b9dc6c-d485-4b7b-94b1-e71337539997" containerID="6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec" exitCode=0 Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.737805 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" event={"ID":"27b9dc6c-d485-4b7b-94b1-e71337539997","Type":"ContainerDied","Data":"6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.742738 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.742959 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.742983 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.761720 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.772948 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.775836 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.782972 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.792183 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.805911 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.820201 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.831779 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.836213 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.836282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.836298 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.836318 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.836359 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.842206 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.856987 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.882981 4907 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.896530 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 
14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.909132 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.922611 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.938853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:49 crc 
kubenswrapper[4907]: I1129 14:28:49.938909 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.938927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.938952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.938969 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:49Z","lastTransitionTime":"2025-11-29T14:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.945872 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.961005 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.974013 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1
cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:49 crc kubenswrapper[4907]: I1129 14:28:49.987616 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:49Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.008000 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.022232 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.034778 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1
cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.040679 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.040745 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.040771 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.040801 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.040820 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.050460 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.061312 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.076194 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.087900 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.101380 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.114034 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.128372 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.137854 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.142545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.142592 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.142607 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.142625 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.142636 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.149397 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.171910 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.217089 4907 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.245118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.245391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.245550 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.245653 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.245734 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.348171 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.348557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.348737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.348918 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.349095 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.451727 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.451796 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.451821 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.451854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.451879 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.555635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.555687 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.555704 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.555728 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.555744 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.658343 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.658393 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.658410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.658433 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.658484 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.752605 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.752586 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" event={"ID":"27b9dc6c-d485-4b7b-94b1-e71337539997","Type":"ContainerStarted","Data":"050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.761347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.761414 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.761472 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.761506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.761530 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.769146 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9
058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.801317 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.823858 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-2
9T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.844983 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.864947 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.865003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.865020 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.865043 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.865060 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.876754 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.901034 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.938098 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.962551 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.967425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.967576 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.967602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.967631 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.967651 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:50Z","lastTransitionTime":"2025-11-29T14:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:50 crc kubenswrapper[4907]: I1129 14:28:50.991920 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:50Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.017081 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:51Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.044365 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:51Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.065012 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:51Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.070320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.070359 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.070371 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.070389 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.070402 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.089783 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:51Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.107377 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:51Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.120528 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:51Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.172552 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.172770 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.172844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.172923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.173023 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.275405 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.275541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.275557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.275580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.275594 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.378617 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.378668 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.378680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.378698 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.378710 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.478744 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.478876 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:51 crc kubenswrapper[4907]: E1129 14:28:51.478929 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:51 crc kubenswrapper[4907]: E1129 14:28:51.479065 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.478873 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:51 crc kubenswrapper[4907]: E1129 14:28:51.479241 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.481010 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.481056 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.481072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.481093 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.481113 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.584855 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.585453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.585523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.585656 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.585755 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.689545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.689776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.689853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.689978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.690046 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.756876 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.793136 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.793389 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.793465 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.793579 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.793661 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.896523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.896601 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.896629 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.896661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.896687 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.999342 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.999409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.999432 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.999521 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:51 crc kubenswrapper[4907]: I1129 14:28:51.999543 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:51Z","lastTransitionTime":"2025-11-29T14:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.102548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.102602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.102620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.102643 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.102660 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:52Z","lastTransitionTime":"2025-11-29T14:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.205654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.206020 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.206199 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.206353 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.206520 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:52Z","lastTransitionTime":"2025-11-29T14:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.292542 4907 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.309145 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.309193 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.309202 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.309216 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.309225 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:52Z","lastTransitionTime":"2025-11-29T14:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.411913 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.411963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.411983 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.412011 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.412030 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:52Z","lastTransitionTime":"2025-11-29T14:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.502354 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.514852 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.514888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.514900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.514920 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.514933 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:52Z","lastTransitionTime":"2025-11-29T14:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.525356 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.542038 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.560628 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.584031 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.613521 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.618463 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.618508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.618528 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.618583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.618594 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:52Z","lastTransitionTime":"2025-11-29T14:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.632316 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z 
is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.664211 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11
-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.680391 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.697541 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1
cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.715360 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.721461 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.721497 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.721508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.721549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.721562 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:52Z","lastTransitionTime":"2025-11-29T14:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.731770 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.747244 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.760195 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.764458 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/0.log" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.770982 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0" exitCode=1 Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 
14:28:52.771027 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.771807 4907 scope.go:117] "RemoveContainer" containerID="50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.778129 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finish
edAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74f
fbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-2
9T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.806943 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14
:28:52Z\\\",\\\"message\\\":\\\"reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 14:28:52.027197 6189 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 14:28:52.027415 6189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028009 6189 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028384 6189 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 14:28:52.028488 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 14:28:52.028563 6189 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 14:28:52.028587 6189 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 14:28:52.028614 6189 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 14:28:52.028644 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 14:28:52.028680 6189 factory.go:656] Stopping watch factory\\\\nI1129 14:28:52.028702 6189 ovnkube.go:599] Stopped 
ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754
ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.819659 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.825796 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.825842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.825865 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.825891 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.825910 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:52Z","lastTransitionTime":"2025-11-29T14:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.842394 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:
24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.867005 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.889662 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1
cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.909780 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.924344 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.929947 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.930082 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.930099 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.930145 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.930161 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:52Z","lastTransitionTime":"2025-11-29T14:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.945734 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.963149 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c0
22a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:52 crc kubenswrapper[4907]: I1129 14:28:52.986198 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:52Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.005788 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.023371 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.034130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.034210 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.034230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.034267 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.034294 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.038767 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.052506 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.071120 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.137467 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.137533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.137550 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.137577 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.137600 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.240420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.240497 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.240509 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.240531 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.240544 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.344594 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.344672 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.344695 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.344724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.344744 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.447847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.447959 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.447984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.448027 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.448054 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.479413 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.479596 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.479633 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:53 crc kubenswrapper[4907]: E1129 14:28:53.479815 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:53 crc kubenswrapper[4907]: E1129 14:28:53.479978 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:53 crc kubenswrapper[4907]: E1129 14:28:53.480237 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.551134 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.551196 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.551210 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.551229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.551249 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.654782 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.654842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.654855 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.654879 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.654894 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.756856 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.756908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.756920 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.756940 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.756951 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.777851 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/0.log" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.782360 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.782563 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.799593 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913
080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z
\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.820874 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"rea
son\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.836828 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.851613 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.859916 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.859963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.859976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.860000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.860013 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.868917 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.882997 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.894897 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.918709 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:52Z\\\",\\\"message\\\":\\\"reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 
14:28:52.027197 6189 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 14:28:52.027415 6189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028009 6189 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028384 6189 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 14:28:52.028488 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 14:28:52.028563 6189 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 14:28:52.028587 6189 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 14:28:52.028614 6189 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 14:28:52.028644 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 14:28:52.028680 6189 factory.go:656] Stopping watch factory\\\\nI1129 14:28:52.028702 6189 ovnkube.go:599] Stopped 
ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.935110 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.958778 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.962208 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.962250 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.962262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.962282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.962295 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:53Z","lastTransitionTime":"2025-11-29T14:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.973994 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:53 crc kubenswrapper[4907]: I1129 14:28:53.987560 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1
cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.004368 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.018523 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.036710 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.064551 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc 
kubenswrapper[4907]: I1129 14:28:54.064597 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.064607 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.064627 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.064639 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.096697 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.096739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.096749 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.096768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.096779 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: E1129 14:28:54.117095 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.122559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.122597 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.122607 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.122620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.122630 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: E1129 14:28:54.142548 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.147837 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.147908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.147931 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.147961 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.147982 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: E1129 14:28:54.194290 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.199964 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.200027 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.200045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.200073 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.200093 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: E1129 14:28:54.221599 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: E1129 14:28:54.221842 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.225009 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.225332 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.225612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.225860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.226100 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.227196 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh"] Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.228401 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.230609 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.231837 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.254773 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.275804 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.292369 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.311339 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.329820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.329886 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.329905 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.329928 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.329946 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.331199 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.354800 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.364156 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnh52\" (UniqueName: \"kubernetes.io/projected/5cad52eb-140b-46cc-bbe1-fdada0728e67-kube-api-access-jnh52\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.364262 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cad52eb-140b-46cc-bbe1-fdada0728e67-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.364327 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5cad52eb-140b-46cc-bbe1-fdada0728e67-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.364373 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5cad52eb-140b-46cc-bbe1-fdada0728e67-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.393536 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:52Z\\\",\\\"message\\\":\\\"reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 14:28:52.027197 6189 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 14:28:52.027415 6189 reflector.go:311] Stopping 
reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028009 6189 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028384 6189 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 14:28:52.028488 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 14:28:52.028563 6189 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 14:28:52.028587 6189 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 14:28:52.028614 6189 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 14:28:52.028644 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 14:28:52.028680 6189 factory.go:656] Stopping watch factory\\\\nI1129 14:28:52.028702 6189 ovnkube.go:599] Stopped 
ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.411388 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.426349 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.432416 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc 
kubenswrapper[4907]: I1129 14:28:54.432481 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.432494 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.432512 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.432549 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.465751 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cad52eb-140b-46cc-bbe1-fdada0728e67-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.466123 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5cad52eb-140b-46cc-bbe1-fdada0728e67-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.466325 4907 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5cad52eb-140b-46cc-bbe1-fdada0728e67-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.466576 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnh52\" (UniqueName: \"kubernetes.io/projected/5cad52eb-140b-46cc-bbe1-fdada0728e67-kube-api-access-jnh52\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.467610 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5cad52eb-140b-46cc-bbe1-fdada0728e67-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.467668 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5cad52eb-140b-46cc-bbe1-fdada0728e67-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.485085 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cad52eb-140b-46cc-bbe1-fdada0728e67-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.487890 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Comp
leted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.500092 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnh52\" (UniqueName: \"kubernetes.io/projected/5cad52eb-140b-46cc-bbe1-fdada0728e67-kube-api-access-jnh52\") pod \"ovnkube-control-plane-749d76644c-2hghh\" (UID: \"5cad52eb-140b-46cc-bbe1-fdada0728e67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.519091 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.535460 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.535503 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.535516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.535537 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.535550 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.542330 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.544418 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.572341 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e1
94222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.588958 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.602056 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.618483 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.640242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.640275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.640286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.640303 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.640315 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.743653 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.743730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.743744 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.743761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.743775 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.789206 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/1.log" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.789878 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/0.log" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.794355 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb" exitCode=1 Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.794491 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.794682 4907 scope.go:117] "RemoveContainer" containerID="50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.795865 4907 scope.go:117] "RemoveContainer" containerID="40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb" Nov 29 14:28:54 crc kubenswrapper[4907]: E1129 14:28:54.796129 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.796824 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" event={"ID":"5cad52eb-140b-46cc-bbe1-fdada0728e67","Type":"ContainerStarted","Data":"ae8a742083ad47d38a7e5dccb40fa78fbd550ad339fc03b591cdf2a1f83ef715"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.819555 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.844825 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.847045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.847128 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.847147 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.847178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.847202 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.867260 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.893086 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.925943 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.948862 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.951217 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.951313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.951334 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.951364 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.951386 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:54Z","lastTransitionTime":"2025-11-29T14:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.970589 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:54 crc kubenswrapper[4907]: I1129 14:28:54.990572 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:54Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.018318 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.037993 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.054024 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.054071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.054088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.054113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.054132 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.055095 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.075172 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.093216 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.117923 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.135639 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.156882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.156965 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.156984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.157015 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.157034 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.161543 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:52Z\\\",\\\"message\\\":\\\"reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 14:28:52.027197 6189 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 14:28:52.027415 6189 reflector.go:311] Stopping 
reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028009 6189 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028384 6189 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 14:28:52.028488 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 14:28:52.028563 6189 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 14:28:52.028587 6189 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 14:28:52.028614 6189 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 14:28:52.028644 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 14:28:52.028680 6189 factory.go:656] Stopping watch factory\\\\nI1129 14:28:52.028702 6189 ovnkube.go:599] Stopped ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"ndler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z]\\\\nI1129 14:28:53.922264 6355 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 14:28:53.922123 6355 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI1129 14:28:53.922300 6355 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1129 14:28:53.921921 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-t4jq9 after 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\
"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.259697 4907 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.259791 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.259816 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.259851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.259876 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.362955 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.363026 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.363045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.363071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.363135 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.465596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.465661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.465683 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.465712 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.465733 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.478923 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.478947 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.478957 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:55 crc kubenswrapper[4907]: E1129 14:28:55.479095 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:55 crc kubenswrapper[4907]: E1129 14:28:55.479265 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:55 crc kubenswrapper[4907]: E1129 14:28:55.479422 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.569154 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.569223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.569240 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.569270 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.569288 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.672077 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.672160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.672178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.672200 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.672217 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.735673 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-25ct5"] Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.736394 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:55 crc kubenswrapper[4907]: E1129 14:28:55.736524 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.763912 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc 
kubenswrapper[4907]: I1129 14:28:55.775745 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.775803 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.775823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.775847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.775865 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.784713 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9
058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.787402 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.787595 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt9sw\" (UniqueName: \"kubernetes.io/projected/9f50e55a-d427-4cde-a639-d6c7597e937a-kube-api-access-wt9sw\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " 
pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.805136 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/1.log" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.811173 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" event={"ID":"5cad52eb-140b-46cc-bbe1-fdada0728e67","Type":"ContainerStarted","Data":"54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.811231 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" event={"ID":"5cad52eb-140b-46cc-bbe1-fdada0728e67","Type":"ContainerStarted","Data":"e11b55c3028b5a03b82c77911c2890b6d8aaa17dd653482f45ef3eb91b348b07"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.824079 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:52Z\\\",\\\"message\\\":\\\"reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 
14:28:52.027197 6189 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 14:28:52.027415 6189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028009 6189 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028384 6189 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 14:28:52.028488 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 14:28:52.028563 6189 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 14:28:52.028587 6189 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 14:28:52.028614 6189 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 14:28:52.028644 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 14:28:52.028680 6189 factory.go:656] Stopping watch factory\\\\nI1129 14:28:52.028702 6189 ovnkube.go:599] Stopped ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"ndler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z]\\\\nI1129 14:28:53.922264 6355 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 14:28:53.922123 6355 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI1129 14:28:53.922300 6355 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1129 14:28:53.921921 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-t4jq9 after 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.855866 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.877881 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28
:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.878752 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.878787 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.878801 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.878824 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.878838 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.889199 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.889392 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt9sw\" (UniqueName: \"kubernetes.io/projected/9f50e55a-d427-4cde-a639-d6c7597e937a-kube-api-access-wt9sw\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:55 crc kubenswrapper[4907]: E1129 14:28:55.890035 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:28:55 crc kubenswrapper[4907]: E1129 14:28:55.890140 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs podName:9f50e55a-d427-4cde-a639-d6c7597e937a nodeName:}" failed. No retries permitted until 2025-11-29 14:28:56.390111017 +0000 UTC m=+34.376948759 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs") pod "network-metrics-daemon-25ct5" (UID: "9f50e55a-d427-4cde-a639-d6c7597e937a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.899137 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-
cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.926920 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.927560 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt9sw\" (UniqueName: \"kubernetes.io/projected/9f50e55a-d427-4cde-a639-d6c7597e937a-kube-api-access-wt9sw\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.948792 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.966239 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.981508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:55 crc 
kubenswrapper[4907]: I1129 14:28:55.981576 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.981600 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.981632 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.981655 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:55Z","lastTransitionTime":"2025-11-29T14:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:55 crc kubenswrapper[4907]: I1129 14:28:55.983993 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:55Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.003292 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.019613 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.032806 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.045843 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.059045 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.077181 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.084941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.084991 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.085009 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.085033 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.085050 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:56Z","lastTransitionTime":"2025-11-29T14:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.093414 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.107733 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.124931 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.141895 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.160326 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.175377 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.187821 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.187878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.187898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.187923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.187943 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:56Z","lastTransitionTime":"2025-11-29T14:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.188491 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.202162 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483
834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.220163 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.243225 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:52Z\\\",\\\"message\\\":\\\"reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 
14:28:52.027197 6189 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 14:28:52.027415 6189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028009 6189 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028384 6189 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 14:28:52.028488 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 14:28:52.028563 6189 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 14:28:52.028587 6189 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 14:28:52.028614 6189 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 14:28:52.028644 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 14:28:52.028680 6189 factory.go:656] Stopping watch factory\\\\nI1129 14:28:52.028702 6189 ovnkube.go:599] Stopped ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"ndler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z]\\\\nI1129 14:28:53.922264 6355 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 14:28:53.922123 6355 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI1129 14:28:53.922300 6355 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1129 14:28:53.921921 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-t4jq9 after 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.258478 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.282728 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.291908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.291970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.291989 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.292014 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.292037 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:56Z","lastTransitionTime":"2025-11-29T14:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.318310 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.340100 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.359152 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.377016 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.395311 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:56 crc kubenswrapper[4907]: E1129 14:28:56.395524 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Nov 29 14:28:56 crc kubenswrapper[4907]: E1129 14:28:56.395589 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs podName:9f50e55a-d427-4cde-a639-d6c7597e937a nodeName:}" failed. No retries permitted until 2025-11-29 14:28:57.395569746 +0000 UTC m=+35.382407408 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs") pod "network-metrics-daemon-25ct5" (UID: "9f50e55a-d427-4cde-a639-d6c7597e937a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.396215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.396313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.396341 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.396381 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.396419 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:56Z","lastTransitionTime":"2025-11-29T14:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.400004 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.420529 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:56Z is after 2025-08-24T17:21:41Z" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.500258 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:56 crc 
kubenswrapper[4907]: I1129 14:28:56.500330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.500349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.500376 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.500394 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:56Z","lastTransitionTime":"2025-11-29T14:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.603579 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.603637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.603655 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.603680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.603697 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:56Z","lastTransitionTime":"2025-11-29T14:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.706754 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.706834 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.706857 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.706885 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.706904 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:56Z","lastTransitionTime":"2025-11-29T14:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.809766 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.809817 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.809833 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.809858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.809875 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:56Z","lastTransitionTime":"2025-11-29T14:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.912594 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.912649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.912678 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.912702 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:56 crc kubenswrapper[4907]: I1129 14:28:56.912720 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:56Z","lastTransitionTime":"2025-11-29T14:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.015711 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.015751 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.015762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.015780 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.015790 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.118227 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.118286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.118302 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.118326 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.118344 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.221012 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.221072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.221089 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.221119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.221137 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.323175 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.323238 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.323263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.323294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.323317 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.406656 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.406855 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.406917 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.406972 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407025 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407054 4907 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:29:13.407015959 +0000 UTC m=+51.393853641 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407101 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs podName:9f50e55a-d427-4cde-a639-d6c7597e937a nodeName:}" failed. No retries permitted until 2025-11-29 14:28:59.407085801 +0000 UTC m=+37.393923483 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs") pod "network-metrics-daemon-25ct5" (UID: "9f50e55a-d427-4cde-a639-d6c7597e937a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407183 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407195 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407246 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407269 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.407093 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407271 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:29:13.407243015 +0000 UTC m=+51.394080707 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407361 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.407365 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407392 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407411 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407493 4907 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 14:29:13.40743532 +0000 UTC m=+51.394273072 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407533 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 14:29:13.407515953 +0000 UTC m=+51.394353655 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407582 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.407663 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:29:13.407638136 +0000 UTC m=+51.394475848 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.426058 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.426110 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.426127 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.426152 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.426169 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.479413 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.479643 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.479687 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.479754 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.479852 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.479914 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.479970 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:28:57 crc kubenswrapper[4907]: E1129 14:28:57.480105 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.528839 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.528979 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.528998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.529022 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.529039 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.631192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.631261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.631285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.631316 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.631336 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.734133 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.734187 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.734205 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.734227 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.734247 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.836419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.836496 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.836510 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.836530 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.836545 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.939368 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.939409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.939421 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.939467 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:57 crc kubenswrapper[4907]: I1129 14:28:57.939480 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:57Z","lastTransitionTime":"2025-11-29T14:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.042045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.042088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.042100 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.042119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.042132 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.144698 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.144742 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.144756 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.144773 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.144786 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.258097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.258176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.258195 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.258218 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.258237 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.361023 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.361067 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.361088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.361111 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.361127 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.463284 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.463317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.463330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.463346 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.463357 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.565998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.566032 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.566052 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.566069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.566078 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.669944 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.670014 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.670032 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.670059 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.670077 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.773396 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.773502 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.773527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.773552 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.773569 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.875978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.876017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.876029 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.876044 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.876057 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.979080 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.979140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.979159 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.979186 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:58 crc kubenswrapper[4907]: I1129 14:28:58.979203 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:58Z","lastTransitionTime":"2025-11-29T14:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.082339 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.082406 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.082426 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.082487 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.082507 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:59Z","lastTransitionTime":"2025-11-29T14:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.186162 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.186556 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.186796 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.186968 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.187338 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:59Z","lastTransitionTime":"2025-11-29T14:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.290294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.290346 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.290365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.290393 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.290412 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:59Z","lastTransitionTime":"2025-11-29T14:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.426587 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:59 crc kubenswrapper[4907]: E1129 14:28:59.426917 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:28:59 crc kubenswrapper[4907]: E1129 14:28:59.427070 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs podName:9f50e55a-d427-4cde-a639-d6c7597e937a nodeName:}" failed. No retries permitted until 2025-11-29 14:29:03.427032574 +0000 UTC m=+41.413870306 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs") pod "network-metrics-daemon-25ct5" (UID: "9f50e55a-d427-4cde-a639-d6c7597e937a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.427304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.427340 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.427353 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.427372 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.427383 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:59Z","lastTransitionTime":"2025-11-29T14:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.479417 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.479511 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.479526 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.479421 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:28:59 crc kubenswrapper[4907]: E1129 14:28:59.479672 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:28:59 crc kubenswrapper[4907]: E1129 14:28:59.479812 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:28:59 crc kubenswrapper[4907]: E1129 14:28:59.479907 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:28:59 crc kubenswrapper[4907]: E1129 14:28:59.479976 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.530391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.530698 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.530727 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.530753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.530771 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:59Z","lastTransitionTime":"2025-11-29T14:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.634227 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.634282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.634299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.634323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.634341 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:59Z","lastTransitionTime":"2025-11-29T14:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.737987 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.738055 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.738080 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.738114 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.738136 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:59Z","lastTransitionTime":"2025-11-29T14:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.841953 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.842027 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.842053 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.842084 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.842108 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:59Z","lastTransitionTime":"2025-11-29T14:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.944679 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.944731 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.944746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.944769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:28:59 crc kubenswrapper[4907]: I1129 14:28:59.944787 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:28:59Z","lastTransitionTime":"2025-11-29T14:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.047639 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.047681 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.047698 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.047720 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.047737 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.150679 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.150976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.151108 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.151261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.151386 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.254432 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.254549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.254567 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.254594 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.254612 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.356600 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.356906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.357049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.357176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.357318 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.459977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.460304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.460483 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.460790 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.460935 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.564325 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.564395 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.564413 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.564464 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.564484 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.668069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.668328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.668526 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.668703 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.668897 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.709290 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.728588 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.747655 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.770165 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.772401 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.772608 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.772742 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.772860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.772986 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.791698 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.814671 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.829432 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.846355 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.868917 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.876088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.876145 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.876165 4907 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.876191 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.876209 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.890156 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.920200 4907 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:52Z\\\",\\\"message\\\":\\\"reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 14:28:52.027197 6189 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 14:28:52.027415 6189 reflector.go:311] Stopping 
reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028009 6189 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028384 6189 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 14:28:52.028488 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 14:28:52.028563 6189 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 14:28:52.028587 6189 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 14:28:52.028614 6189 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 14:28:52.028644 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 14:28:52.028680 6189 factory.go:656] Stopping watch factory\\\\nI1129 14:28:52.028702 6189 ovnkube.go:599] Stopped ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"ndler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z]\\\\nI1129 14:28:53.922264 6355 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 14:28:53.922123 6355 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI1129 14:28:53.922300 6355 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1129 14:28:53.921921 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-t4jq9 after 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\
"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.937420 4907 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc 
kubenswrapper[4907]: I1129 14:29:00.970561 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.978931 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.978998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.979018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.979045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.979063 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:00Z","lastTransitionTime":"2025-11-29T14:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:00 crc kubenswrapper[4907]: I1129 14:29:00.989106 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:00Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.011422 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:01Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.030622 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:01Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.049140 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:01Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.068606 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:01Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.081691 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:01 crc 
kubenswrapper[4907]: I1129 14:29:01.081757 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.081775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.081802 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.081820 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:01Z","lastTransitionTime":"2025-11-29T14:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.186057 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.186137 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.186152 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.186180 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.186198 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:01Z","lastTransitionTime":"2025-11-29T14:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.288557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.288616 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.288628 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.288645 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.288657 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:01Z","lastTransitionTime":"2025-11-29T14:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.391667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.391733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.391753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.391783 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.391801 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:01Z","lastTransitionTime":"2025-11-29T14:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.478941 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.478963 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.478981 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:01 crc kubenswrapper[4907]: E1129 14:29:01.479128 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.479147 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:01 crc kubenswrapper[4907]: E1129 14:29:01.479288 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:01 crc kubenswrapper[4907]: E1129 14:29:01.479640 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:01 crc kubenswrapper[4907]: E1129 14:29:01.479725 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.496707 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.496795 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.496823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.496858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.496882 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:01Z","lastTransitionTime":"2025-11-29T14:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.599970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.600031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.600049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.600078 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.600096 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:01Z","lastTransitionTime":"2025-11-29T14:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.703306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.703369 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.703387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.703414 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.703432 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:01Z","lastTransitionTime":"2025-11-29T14:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.806995 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.807412 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.807591 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.807782 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.807949 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:01Z","lastTransitionTime":"2025-11-29T14:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.911631 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.911923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.912057 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.912191 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:01 crc kubenswrapper[4907]: I1129 14:29:01.912411 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:01Z","lastTransitionTime":"2025-11-29T14:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.015362 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.016327 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.016531 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.016671 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.016914 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.120667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.120737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.120756 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.120783 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.120802 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.223394 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.223491 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.223512 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.223535 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.223553 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.325935 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.326008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.326028 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.326058 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.326079 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.430317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.430869 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.431018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.431546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.431632 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.500935 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.521061 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.537695 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.537797 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.537814 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 
14:29:02.537841 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.537863 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.542419 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.569655 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.588917 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.609328 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc 
kubenswrapper[4907]: I1129 14:29:02.634819 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.643708 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.643752 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.643770 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.643793 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.643813 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.664098 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50fe75e419a84a6cf8dfc5235bdc5f892be9cb39b8038afbbfe55ce5e16daea0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:52Z\\\",\\\"message\\\":\\\"reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 14:28:52.027197 6189 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 14:28:52.027415 6189 reflector.go:311] Stopping 
reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028009 6189 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 14:28:52.028384 6189 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 14:28:52.028488 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 14:28:52.028563 6189 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 14:28:52.028587 6189 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 14:28:52.028614 6189 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 14:28:52.028644 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 14:28:52.028680 6189 factory.go:656] Stopping watch factory\\\\nI1129 14:28:52.028702 6189 ovnkube.go:599] Stopped ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"ndler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z]\\\\nI1129 14:28:53.922264 6355 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 14:28:53.922123 6355 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI1129 14:28:53.922300 6355 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1129 14:28:53.921921 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-t4jq9 after 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\
"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.693279 4907 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b
0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750
657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.719507 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:
23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.743329 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1c
b99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.747842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.747893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.747910 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.747933 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.747950 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.766183 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.784920 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.805814 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.830036 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b18197
80b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:
28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.850480 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.851583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.851693 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.851728 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.851764 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.851788 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.866060 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-29T14:29:02Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.955308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.955361 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.955373 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.955395 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:02 crc kubenswrapper[4907]: I1129 14:29:02.955408 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:02Z","lastTransitionTime":"2025-11-29T14:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.058897 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.058989 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.059018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.059057 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.059103 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.162493 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.162580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.162599 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.162632 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.162652 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.265840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.265912 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.265929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.265952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.265968 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.374057 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.374147 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.374171 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.374200 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.374226 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.474065 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:03 crc kubenswrapper[4907]: E1129 14:29:03.474335 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:29:03 crc kubenswrapper[4907]: E1129 14:29:03.474422 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs podName:9f50e55a-d427-4cde-a639-d6c7597e937a nodeName:}" failed. No retries permitted until 2025-11-29 14:29:11.474398317 +0000 UTC m=+49.461236009 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs") pod "network-metrics-daemon-25ct5" (UID: "9f50e55a-d427-4cde-a639-d6c7597e937a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.477384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.477481 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.477520 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.477547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.477564 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.478742 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.478757 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.478859 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.478887 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:03 crc kubenswrapper[4907]: E1129 14:29:03.479053 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:03 crc kubenswrapper[4907]: E1129 14:29:03.479200 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:03 crc kubenswrapper[4907]: E1129 14:29:03.479489 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:03 crc kubenswrapper[4907]: E1129 14:29:03.479554 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.580084 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.580140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.580153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.580176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.580192 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.683119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.683190 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.683210 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.683234 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.683253 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.786501 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.786634 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.786661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.786694 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.786717 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.889478 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.889539 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.889555 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.889578 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.889596 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.992091 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.992143 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.992158 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.992181 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:03 crc kubenswrapper[4907]: I1129 14:29:03.992200 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:03Z","lastTransitionTime":"2025-11-29T14:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.096715 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.096763 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.096776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.096796 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.096814 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.201595 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.201667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.201685 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.201715 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.201736 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.305709 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.305810 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.305835 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.305866 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.305887 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.409153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.409229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.409248 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.409276 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.409296 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.511828 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.511909 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.511927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.511962 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.511979 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.528164 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.528241 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.528262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.528294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.528315 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: E1129 14:29:04.551547 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:04Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.556829 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.556888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.556907 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.557000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.557018 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: E1129 14:29:04.580357 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:04Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.586593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.586650 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.586669 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.586694 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.586715 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: E1129 14:29:04.609371 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:04Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.615679 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.615775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.615798 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.615824 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.615842 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: E1129 14:29:04.634318 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:04Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.638851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.638995 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.639079 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.639172 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.639213 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: E1129 14:29:04.658298 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:04Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:04 crc kubenswrapper[4907]: E1129 14:29:04.658491 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.661252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.661322 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.661346 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.661381 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.661413 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.765606 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.765715 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.765737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.765768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.765789 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.868970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.869030 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.869044 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.869069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.869088 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.972014 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.972099 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.972117 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.972145 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:04 crc kubenswrapper[4907]: I1129 14:29:04.972165 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:04Z","lastTransitionTime":"2025-11-29T14:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.078975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.079067 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.079083 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.079106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.079127 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:05Z","lastTransitionTime":"2025-11-29T14:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.183131 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.183189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.183206 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.183229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.183247 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:05Z","lastTransitionTime":"2025-11-29T14:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.286563 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.286619 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.286635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.286660 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.286679 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:05Z","lastTransitionTime":"2025-11-29T14:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.389789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.389844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.389862 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.389888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.389906 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:05Z","lastTransitionTime":"2025-11-29T14:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.479015 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.479049 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.479073 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.479019 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:05 crc kubenswrapper[4907]: E1129 14:29:05.479189 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:05 crc kubenswrapper[4907]: E1129 14:29:05.479336 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:05 crc kubenswrapper[4907]: E1129 14:29:05.479468 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:05 crc kubenswrapper[4907]: E1129 14:29:05.479652 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.492276 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.492320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.492342 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.492371 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.492393 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:05Z","lastTransitionTime":"2025-11-29T14:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.594907 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.595161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.595269 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.595352 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.595415 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:05Z","lastTransitionTime":"2025-11-29T14:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.699286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.699374 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.699391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.699418 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.699461 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:05Z","lastTransitionTime":"2025-11-29T14:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.802937 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.803001 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.803021 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.803080 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.803097 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:05Z","lastTransitionTime":"2025-11-29T14:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.906349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.906399 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.906408 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.906427 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:05 crc kubenswrapper[4907]: I1129 14:29:05.906451 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:05Z","lastTransitionTime":"2025-11-29T14:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.008978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.009044 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.009062 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.009092 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.009111 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.112100 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.112161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.112179 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.112208 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.112227 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.215418 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.215515 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.215533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.215560 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.215579 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.318676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.318756 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.318775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.318805 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.318827 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.421693 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.421753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.421770 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.421798 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.421816 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.524848 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.524917 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.524929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.524955 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.524973 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.628142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.628226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.628243 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.628277 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.628296 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.731627 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.731704 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.731724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.731751 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.731771 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.834688 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.834765 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.834784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.834813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.834832 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.938847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.938928 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.938943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.938970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:06 crc kubenswrapper[4907]: I1129 14:29:06.938992 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:06Z","lastTransitionTime":"2025-11-29T14:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.042250 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.042302 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.042313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.042332 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.042345 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.146670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.146754 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.146773 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.146801 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.146835 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.250207 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.250279 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.250296 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.250324 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.250343 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.358586 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.358649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.358669 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.358699 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.358721 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.461767 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.461824 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.461840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.461864 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.461880 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.479502 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.479559 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.479605 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:07 crc kubenswrapper[4907]: E1129 14:29:07.479766 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.480221 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:07 crc kubenswrapper[4907]: E1129 14:29:07.480391 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:07 crc kubenswrapper[4907]: E1129 14:29:07.480486 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:07 crc kubenswrapper[4907]: E1129 14:29:07.480574 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.481485 4907 scope.go:117] "RemoveContainer" containerID="40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.495900 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.512079 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.529385 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.544151 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.557197 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.566049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.566087 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.566095 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.566116 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.566127 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.570099 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.582138 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.597855 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.615413 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.643203 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"ndler for anpInformer during admin network policy controller initialization, handler 
{0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z]\\\\nI1129 14:28:53.922264 6355 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 14:28:53.922123 6355 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI1129 14:28:53.922300 6355 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1129 14:28:53.921921 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-t4jq9 after 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.661964 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.669382 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.669518 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.669576 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.669604 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.669622 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.685119 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.707472 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.744261 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.770309 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 
UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:
23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.771899 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.771946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.771959 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.771978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.771991 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.787236 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.806224 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.858581 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/1.log" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.861648 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.861773 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.874045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.874093 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.874109 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.874130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.874147 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.877744 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.897700 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.924716 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.944885 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.964081 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.977335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.977376 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.977388 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.977408 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.977422 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:07Z","lastTransitionTime":"2025-11-29T14:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:07 crc kubenswrapper[4907]: I1129 14:29:07.996696 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:07Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.019556 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.036129 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.055529 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.074802 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"ndler for anpInformer during admin network policy controller initialization, handler 
{0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z]\\\\nI1129 14:28:53.922264 6355 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 14:28:53.922123 6355 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI1129 14:28:53.922300 6355 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1129 14:28:53.921921 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-t4jq9 after 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.079275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.079309 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.079319 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.079335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.079346 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:08Z","lastTransitionTime":"2025-11-29T14:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.087756 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc 
kubenswrapper[4907]: I1129 14:29:08.106055 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.122135 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:
23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.137749 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1c
b99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.154998 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.170066 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.181570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.181600 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.181612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.181630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.181641 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:08Z","lastTransitionTime":"2025-11-29T14:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.184551 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.283712 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:08 crc 
kubenswrapper[4907]: I1129 14:29:08.283761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.283774 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.283804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.283817 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:08Z","lastTransitionTime":"2025-11-29T14:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.386780 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.386829 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.386847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.386871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.386889 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:08Z","lastTransitionTime":"2025-11-29T14:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.402974 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.489122 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.489193 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.489211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.489238 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.489262 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:08Z","lastTransitionTime":"2025-11-29T14:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.592695 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.592746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.592762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.592784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.592804 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:08Z","lastTransitionTime":"2025-11-29T14:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.695625 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.695687 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.695704 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.695730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.695750 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:08Z","lastTransitionTime":"2025-11-29T14:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.799538 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.799587 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.799606 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.799630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.799648 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:08Z","lastTransitionTime":"2025-11-29T14:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.868194 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/2.log" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.869368 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/1.log" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.875266 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41" exitCode=1 Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.875311 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.875376 4907 scope.go:117] "RemoveContainer" containerID="40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.876930 4907 scope.go:117] "RemoveContainer" containerID="06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41" Nov 29 14:29:08 crc kubenswrapper[4907]: E1129 14:29:08.877278 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.899758 4907 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.902914 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.902976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.902994 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.903023 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.903047 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:08Z","lastTransitionTime":"2025-11-29T14:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.922416 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.949165 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.969682 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:08 crc kubenswrapper[4907]: I1129 14:29:08.990328 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:08Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.006421 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.006507 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.006527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.006554 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.006579 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.012361 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.031083 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.049929 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.072600 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.104691 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40766e90d83054bd25958c4a57f91da6d801a9561e894708f01a6cfd32295adb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"message\\\":\\\"ndler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:28:53Z is after 2025-08-24T17:21:41Z]\\\\nI1129 14:28:53.922264 6355 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1129 14:28:53.922123 6355 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI1129 14:28:53.922300 6355 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1129 14:28:53.921921 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-t4jq9 after 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"h
ost-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.109966 
4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.110058 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.110080 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.110106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.110126 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.123073 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc 
kubenswrapper[4907]: I1129 14:29:09.145217 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.169352 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.191789 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.213180 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.213264 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.213284 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.213328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.213356 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.215991 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.254167 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f429
28e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092
272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.277203 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\"
,\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.316671 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.316740 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.316761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.316790 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.316813 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.420212 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.420294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.420314 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.420344 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.420364 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.479427 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.479504 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.479644 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:09 crc kubenswrapper[4907]: E1129 14:29:09.479648 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.479664 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:09 crc kubenswrapper[4907]: E1129 14:29:09.479806 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:09 crc kubenswrapper[4907]: E1129 14:29:09.479962 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:09 crc kubenswrapper[4907]: E1129 14:29:09.480051 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.523837 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.523907 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.523950 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.523978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.523997 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.627799 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.627855 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.627897 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.628008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.628023 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.731123 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.731206 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.731228 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.731263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.731293 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.834581 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.834663 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.834685 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.835034 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.835249 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.881324 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/2.log" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.885878 4907 scope.go:117] "RemoveContainer" containerID="06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41" Nov 29 14:29:09 crc kubenswrapper[4907]: E1129 14:29:09.886141 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.907927 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.940291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:09 crc 
kubenswrapper[4907]: I1129 14:29:09.940349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.940411 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.940463 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.940325 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c
6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.940484 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:09Z","lastTransitionTime":"2025-11-29T14:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.959171 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:09 crc kubenswrapper[4907]: I1129 14:29:09.979462 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:09Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.005085 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.025557 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.039732 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.043527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.043593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.043611 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.043638 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.043657 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.051989 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2
dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.064042 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1
f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:
49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.074471 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.084890 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.095380 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.107657 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.119339 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.133349 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.150396 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.150427 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.150452 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.150473 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.150488 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.154271 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.166367 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:10Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.253738 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.253809 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.253828 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.253857 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.253879 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.357708 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.357772 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.357792 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.357817 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.357843 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.460073 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.460157 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.460183 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.460214 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.460239 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.564188 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.564246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.564262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.564285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.564301 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.667046 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.667107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.667129 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.667157 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.667176 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.770025 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.770092 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.770113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.770142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.770166 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.872859 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.872924 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.872944 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.873157 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.873178 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.977097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.977256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.977280 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.977312 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:10 crc kubenswrapper[4907]: I1129 14:29:10.977333 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:10Z","lastTransitionTime":"2025-11-29T14:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.079600 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.079659 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.079670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.079698 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.079712 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:11Z","lastTransitionTime":"2025-11-29T14:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.182679 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.182739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.182757 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.182788 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.182807 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:11Z","lastTransitionTime":"2025-11-29T14:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.285970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.286043 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.286063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.286094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.286117 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:11Z","lastTransitionTime":"2025-11-29T14:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.388872 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.388934 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.388951 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.388978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.388996 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:11Z","lastTransitionTime":"2025-11-29T14:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.479465 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.479481 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.479596 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:11 crc kubenswrapper[4907]: E1129 14:29:11.479778 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.479873 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:11 crc kubenswrapper[4907]: E1129 14:29:11.479933 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:11 crc kubenswrapper[4907]: E1129 14:29:11.480073 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:11 crc kubenswrapper[4907]: E1129 14:29:11.480187 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.491802 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.491847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.491863 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.491885 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.491904 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:11Z","lastTransitionTime":"2025-11-29T14:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.573110 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:11 crc kubenswrapper[4907]: E1129 14:29:11.573322 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:29:11 crc kubenswrapper[4907]: E1129 14:29:11.573422 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs podName:9f50e55a-d427-4cde-a639-d6c7597e937a nodeName:}" failed. No retries permitted until 2025-11-29 14:29:27.573397129 +0000 UTC m=+65.560234811 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs") pod "network-metrics-daemon-25ct5" (UID: "9f50e55a-d427-4cde-a639-d6c7597e937a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.594393 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.594452 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.594463 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.594482 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.594495 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:11Z","lastTransitionTime":"2025-11-29T14:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.697632 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.697714 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.697734 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.697755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.697769 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:11Z","lastTransitionTime":"2025-11-29T14:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.800805 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.800991 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.801008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.801032 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.801050 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:11Z","lastTransitionTime":"2025-11-29T14:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.905081 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.905185 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.905251 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.905286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:11 crc kubenswrapper[4907]: I1129 14:29:11.905354 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:11Z","lastTransitionTime":"2025-11-29T14:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.008643 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.008729 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.008755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.008776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.008792 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.111812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.111879 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.111901 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.111951 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.111967 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.215071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.215126 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.215144 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.215167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.215185 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.318149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.318308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.318333 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.318363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.318381 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.421485 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.421549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.421570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.421598 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.421621 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.499635 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.519540 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.525505 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.525576 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.525593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:12 crc 
kubenswrapper[4907]: I1129 14:29:12.525619 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.525636 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.545535 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68
19c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.568903 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.590141 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.608428 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.625368 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.629117 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.629160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.629179 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.629204 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.629221 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.642411 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTim
e\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.658226 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.689037 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.707797 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.726221 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.740502 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.740559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 
14:29:12.740579 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.740605 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.740623 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.743975 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.760210 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.790929 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.810284 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 
UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:
23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.825849 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1c
b99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:12Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.843375 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.843500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.843523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.843551 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.843570 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.946657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.946724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.946743 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.946768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:12 crc kubenswrapper[4907]: I1129 14:29:12.946786 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:12Z","lastTransitionTime":"2025-11-29T14:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.049099 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.049187 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.049236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.049261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.049279 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.151893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.151953 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.151968 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.151993 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.152012 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.254712 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.254767 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.254787 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.254808 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.254827 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.357908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.357980 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.358002 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.358028 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.358045 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.461142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.461205 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.461222 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.461246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.461265 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.479260 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.479344 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.479344 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.479432 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.479637 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.479687 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.479815 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.479965 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.493742 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.493906 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:29:45.493872677 +0000 UTC m=+83.480710359 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.494008 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.494048 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.494099 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.494156 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494189 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494281 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:29:45.494251897 +0000 UTC m=+83.481089609 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494292 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494324 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494332 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494363 4907 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494385 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494397 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494408 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494344 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:29:45.494329959 +0000 UTC m=+83.481167651 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494501 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 14:29:45.494482674 +0000 UTC m=+83.481320366 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:29:13 crc kubenswrapper[4907]: E1129 14:29:13.494523 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 14:29:45.494511954 +0000 UTC m=+83.481349636 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.563733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.563794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.563814 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.563839 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.563859 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.667459 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.667521 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.667546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.667579 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.667603 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.770532 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.770604 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.770622 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.770652 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.770672 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.873832 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.873891 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.873907 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.873932 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.873952 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.977311 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.977419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.977458 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.977483 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:13 crc kubenswrapper[4907]: I1129 14:29:13.977540 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:13Z","lastTransitionTime":"2025-11-29T14:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.080271 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.080336 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.080354 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.080380 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.080398 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.183172 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.183248 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.183289 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.183321 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.183349 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.286702 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.286767 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.286784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.286809 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.286827 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.344525 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.359591 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.367172 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.388384 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.389893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.389960 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.389982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc 
kubenswrapper[4907]: I1129 14:29:14.390011 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.390033 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.408783 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68
19c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.421259 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.440466 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.452321 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.461930 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.474854 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.487779 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.491977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.492023 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.492041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.492063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.492078 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.505973 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.517937 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.537139 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.557777 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\"
,\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.577556 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.594601 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.594633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.594642 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.594658 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.594669 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.597156 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.614233 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.632750 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.697674 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc 
kubenswrapper[4907]: I1129 14:29:14.697761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.697786 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.697820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.697847 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.800785 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.800810 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.800818 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.800830 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.800838 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.899548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.899579 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.899587 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.899599 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.899607 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: E1129 14:29:14.919040 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.922615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.922665 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.922682 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.922703 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.922720 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.946968 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.946994 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.947002 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.947012 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.947021 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.971429 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.971477 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.971485 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.971499 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.971508 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:14 crc kubenswrapper[4907]: E1129 14:29:14.989963 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:14Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.993622 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.993644 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.993652 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.993663 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:14 crc kubenswrapper[4907]: I1129 14:29:14.993672 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:14Z","lastTransitionTime":"2025-11-29T14:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: E1129 14:29:15.004465 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:15Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:15 crc kubenswrapper[4907]: E1129 14:29:15.004573 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.006726 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.006836 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.006858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.006888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.006908 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.109479 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.109529 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.109547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.109572 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.109590 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.212769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.212830 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.212848 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.212870 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.212887 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.315177 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.315229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.315248 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.315272 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.315289 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.418765 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.418821 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.418842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.418868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.418889 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.479500 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.479530 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.479550 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.479566 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:15 crc kubenswrapper[4907]: E1129 14:29:15.479667 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:15 crc kubenswrapper[4907]: E1129 14:29:15.479764 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:15 crc kubenswrapper[4907]: E1129 14:29:15.479988 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:15 crc kubenswrapper[4907]: E1129 14:29:15.480064 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.521692 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.521759 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.521776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.521813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.521835 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.626025 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.626112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.626169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.626201 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.626296 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.728604 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.728669 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.728686 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.728713 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.728731 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.831782 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.831838 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.831851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.831871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.831884 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.934780 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.934858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.934878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.934909 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:15 crc kubenswrapper[4907]: I1129 14:29:15.934932 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:15Z","lastTransitionTime":"2025-11-29T14:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.037526 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.037608 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.037629 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.037658 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.037683 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.140922 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.140975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.140985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.141003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.141017 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.244866 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.244928 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.244947 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.244975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.244994 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.348681 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.348746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.348764 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.348794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.348812 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.451546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.451593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.451603 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.451622 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.451635 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.554847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.554885 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.554896 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.554912 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.554924 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.658682 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.658799 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.658828 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.658865 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.658891 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.762541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.762985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.763017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.763055 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.763084 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.867484 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.867553 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.867574 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.867602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.867623 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.971136 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.971208 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.971228 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.971254 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:16 crc kubenswrapper[4907]: I1129 14:29:16.971272 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:16Z","lastTransitionTime":"2025-11-29T14:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.075487 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.075557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.075582 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.075612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.075637 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:17Z","lastTransitionTime":"2025-11-29T14:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.179204 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.179278 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.179303 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.179338 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.179367 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:17Z","lastTransitionTime":"2025-11-29T14:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.283197 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.283288 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.283311 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.283339 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.283357 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:17Z","lastTransitionTime":"2025-11-29T14:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.386653 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.386825 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.386851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.386886 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.386911 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:17Z","lastTransitionTime":"2025-11-29T14:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.479095 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.479239 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.479246 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.479434 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:17 crc kubenswrapper[4907]: E1129 14:29:17.479661 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:17 crc kubenswrapper[4907]: E1129 14:29:17.479886 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:17 crc kubenswrapper[4907]: E1129 14:29:17.479984 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:17 crc kubenswrapper[4907]: E1129 14:29:17.480074 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.491063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.491121 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.491141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.491167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.491188 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:17Z","lastTransitionTime":"2025-11-29T14:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.594559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.594647 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.594673 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.594711 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.594736 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:17Z","lastTransitionTime":"2025-11-29T14:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.699577 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.699659 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.699679 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.699706 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.699725 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:17Z","lastTransitionTime":"2025-11-29T14:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.803291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.803361 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.803382 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.803415 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.803498 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:17Z","lastTransitionTime":"2025-11-29T14:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.908629 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.908710 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.908731 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.908759 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:17 crc kubenswrapper[4907]: I1129 14:29:17.908781 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:17Z","lastTransitionTime":"2025-11-29T14:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.012509 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.012582 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.012601 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.012633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.012652 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.116816 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.116891 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.116911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.116940 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.116969 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.220428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.220508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.220519 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.220541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.220553 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.324009 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.324105 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.324132 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.324167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.324191 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.428264 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.428406 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.428425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.428527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.428550 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.531929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.532016 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.532039 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.532069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.532090 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.635977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.636045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.636064 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.636095 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.636116 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.739840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.739936 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.739958 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.739991 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.740017 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.843533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.843602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.843620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.843648 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.843670 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.947111 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.947178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.947197 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.947224 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:18 crc kubenswrapper[4907]: I1129 14:29:18.947244 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:18Z","lastTransitionTime":"2025-11-29T14:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.058210 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.058287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.058311 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.058343 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.058366 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.162545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.162616 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.162642 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.162677 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.162697 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.265849 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.265929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.265947 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.265982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.266002 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.369891 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.369970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.369991 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.370022 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.370043 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.473890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.473965 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.473985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.474015 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.474035 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.479249 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.479327 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.479346 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:19 crc kubenswrapper[4907]: E1129 14:29:19.479571 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.479594 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:19 crc kubenswrapper[4907]: E1129 14:29:19.479739 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:19 crc kubenswrapper[4907]: E1129 14:29:19.479830 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:19 crc kubenswrapper[4907]: E1129 14:29:19.480010 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.578059 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.578118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.578129 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.578153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.578172 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.682086 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.682190 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.682214 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.682315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.682346 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.785292 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.785354 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.785371 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.785397 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.785418 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.888370 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.888426 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.888499 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.888524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.888543 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.991836 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.991898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.991917 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.991946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:19 crc kubenswrapper[4907]: I1129 14:29:19.991967 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:19Z","lastTransitionTime":"2025-11-29T14:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.099060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.099140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.099162 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.099192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.099223 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:20Z","lastTransitionTime":"2025-11-29T14:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.202853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.202923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.202941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.202970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.202992 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:20Z","lastTransitionTime":"2025-11-29T14:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.307062 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.307165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.307185 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.307214 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.307297 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:20Z","lastTransitionTime":"2025-11-29T14:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.411187 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.411262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.411281 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.411313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.411334 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:20Z","lastTransitionTime":"2025-11-29T14:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.514553 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.514622 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.514643 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.514670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.514689 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:20Z","lastTransitionTime":"2025-11-29T14:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.619140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.619242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.619270 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.619310 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.619336 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:20Z","lastTransitionTime":"2025-11-29T14:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.722984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.723053 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.723075 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.723107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.723133 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:20Z","lastTransitionTime":"2025-11-29T14:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.827094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.827164 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.827183 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.827211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.827228 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:20Z","lastTransitionTime":"2025-11-29T14:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.931043 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.931115 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.931133 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.931161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:20 crc kubenswrapper[4907]: I1129 14:29:20.931182 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:20Z","lastTransitionTime":"2025-11-29T14:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.034613 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.034676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.034693 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.034718 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.034737 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.138485 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.138558 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.138578 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.138605 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.138625 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.241825 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.241890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.241908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.241935 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.241957 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.345511 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.345599 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.345618 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.345649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.345669 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.449244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.449323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.449347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.449382 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.449404 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.479000 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.479027 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.479142 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.479171 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:21 crc kubenswrapper[4907]: E1129 14:29:21.479419 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:21 crc kubenswrapper[4907]: E1129 14:29:21.479619 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:21 crc kubenswrapper[4907]: E1129 14:29:21.479862 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:21 crc kubenswrapper[4907]: E1129 14:29:21.479968 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.553532 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.553624 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.553645 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.553677 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.553702 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.657349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.657430 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.657501 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.657538 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.657560 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.761225 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.761347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.761373 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.761399 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.761416 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.864690 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.864746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.864764 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.864805 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.864824 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.967414 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.967524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.967549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.967580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:21 crc kubenswrapper[4907]: I1129 14:29:21.967604 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:21Z","lastTransitionTime":"2025-11-29T14:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.070237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.070305 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.070322 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.070349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.070366 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:22Z","lastTransitionTime":"2025-11-29T14:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.173043 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.173124 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.173148 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.173181 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.173205 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:22Z","lastTransitionTime":"2025-11-29T14:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.276412 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.276492 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.276507 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.276531 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.276546 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:22Z","lastTransitionTime":"2025-11-29T14:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.379795 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.379878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.379898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.379929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.379953 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:22Z","lastTransitionTime":"2025-11-29T14:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.483501 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.483578 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.483593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.483619 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.483635 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:22Z","lastTransitionTime":"2025-11-29T14:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.499317 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc 
kubenswrapper[4907]: I1129 14:29:22.521615 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.547652 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.587597 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.587673 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.587692 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.587718 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.587738 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:22Z","lastTransitionTime":"2025-11-29T14:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.587965 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.617540 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\"
,\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.639293 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.663377 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.683066 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.691262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.691328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.691347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.691381 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.691614 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:22Z","lastTransitionTime":"2025-11-29T14:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.738718 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.773239 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec13974a-428b-4841-9234-f3f70b6f2857\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c842d45e7b04ef536026a952134478e9f8aba8dc779b6bc127d2fc89063af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1818b34bee237f8b9788cae86c3541ecb29f693da7f3008bda027c4fe45618db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd8e9e3c38d0d0710dd8297cc120bf4ec2bf18f297b0dc850513d2096377636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b
335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.793934 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.793969 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.793977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.793992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.794002 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:22Z","lastTransitionTime":"2025-11-29T14:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.794262 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.808692 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.820307 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.832190 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.859380 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.883640 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.896293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.896355 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.896370 4907 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.896395 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.896410 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:22Z","lastTransitionTime":"2025-11-29T14:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.905016 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:22 crc kubenswrapper[4907]: I1129 14:29:22.925317 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:22Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.000775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.000853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.000872 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.000901 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.000922 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.104878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.104953 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.104980 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.105008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.105031 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.209544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.210121 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.210148 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.210189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.210211 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.312935 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.313015 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.313035 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.313060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.313079 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.416838 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.416914 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.416932 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.416959 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.416976 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.479045 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.479116 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.479236 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:23 crc kubenswrapper[4907]: E1129 14:29:23.479289 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:23 crc kubenswrapper[4907]: E1129 14:29:23.479363 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.480012 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:23 crc kubenswrapper[4907]: E1129 14:29:23.480101 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.480148 4907 scope.go:117] "RemoveContainer" containerID="06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41" Nov 29 14:29:23 crc kubenswrapper[4907]: E1129 14:29:23.480274 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:23 crc kubenswrapper[4907]: E1129 14:29:23.480397 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.520130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.520192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.520211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.520236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.520257 4907 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.624029 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.624088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.624107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.624132 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.624149 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.726860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.726945 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.726964 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.726992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.727010 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.830062 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.830154 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.830176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.830202 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.830219 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.932205 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.932258 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.932275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.932299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:23 crc kubenswrapper[4907]: I1129 14:29:23.932319 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:23Z","lastTransitionTime":"2025-11-29T14:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.034976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.035035 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.035053 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.035077 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.035094 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.138428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.138546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.138571 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.138603 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.138623 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.241354 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.241420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.241508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.241543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.241568 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.343797 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.343854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.343872 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.343898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.343916 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.446937 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.447002 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.447019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.447047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.447064 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.549748 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.549811 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.549828 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.549852 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.549869 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.652154 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.652218 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.652237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.652262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.652281 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.755763 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.755854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.755876 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.755905 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.755922 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.858428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.858537 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.858548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.858566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.858579 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.961062 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.961118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.961136 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.961161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:24 crc kubenswrapper[4907]: I1129 14:29:24.961179 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:24Z","lastTransitionTime":"2025-11-29T14:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.064633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.064683 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.064702 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.064726 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.064744 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.167704 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.167760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.167778 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.167803 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.167819 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.269175 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.269258 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.269271 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.269292 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.269305 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.291663 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:25Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.297155 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.297213 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.297232 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.297256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.297275 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.317543 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:25Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.322580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.322643 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.322661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.322685 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.322708 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.347652 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:25Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.351785 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.351853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.351873 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.351897 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.351918 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.374052 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:25Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.379247 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.379291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.379308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.379333 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.379352 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.400414 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:25Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.400717 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.402059 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.402122 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.402142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.402169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.402189 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.478690 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.478743 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.478767 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.478861 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.478903 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.479085 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.479144 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:25 crc kubenswrapper[4907]: E1129 14:29:25.479238 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.504724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.504816 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.504841 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.504873 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.504895 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.607986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.608081 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.608106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.608133 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.608156 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.711237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.711284 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.711293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.711306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.711315 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.813413 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.813468 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.813481 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.813499 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.813509 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.915568 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.915596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.915605 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.915619 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:25 crc kubenswrapper[4907]: I1129 14:29:25.915629 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:25Z","lastTransitionTime":"2025-11-29T14:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.019034 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.019075 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.019083 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.019097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.019106 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.121285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.121365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.121392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.121423 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.121479 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.225322 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.225384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.225407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.225469 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.225496 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.327971 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.328066 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.328084 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.328107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.328126 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.431718 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.431788 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.431831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.431866 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.431889 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.535223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.535279 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.535295 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.535319 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.535334 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.642030 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.642292 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.642315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.642741 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.642795 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.745745 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.745792 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.745806 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.745829 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.745846 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.848681 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.848737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.848748 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.848769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.848784 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.951535 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.951645 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.951671 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.951700 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:26 crc kubenswrapper[4907]: I1129 14:29:26.951723 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:26Z","lastTransitionTime":"2025-11-29T14:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.055113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.055214 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.055235 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.055494 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.055530 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.158306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.158369 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.158389 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.158416 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.158434 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.261676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.262040 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.262204 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.262360 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.262555 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.365345 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.365732 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.366163 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.366331 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.366512 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.470052 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.470130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.470147 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.470177 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.470201 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.479224 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.479248 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:27 crc kubenswrapper[4907]: E1129 14:29:27.479367 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.479475 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:27 crc kubenswrapper[4907]: E1129 14:29:27.479572 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:27 crc kubenswrapper[4907]: E1129 14:29:27.479703 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.480006 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:27 crc kubenswrapper[4907]: E1129 14:29:27.480694 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.572815 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.572874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.572889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.572925 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.572947 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.667653 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:27 crc kubenswrapper[4907]: E1129 14:29:27.667905 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:29:27 crc kubenswrapper[4907]: E1129 14:29:27.667978 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs podName:9f50e55a-d427-4cde-a639-d6c7597e937a nodeName:}" failed. No retries permitted until 2025-11-29 14:29:59.667954519 +0000 UTC m=+97.654792211 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs") pod "network-metrics-daemon-25ct5" (UID: "9f50e55a-d427-4cde-a639-d6c7597e937a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.675729 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.675808 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.675827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.675862 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.675883 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.779350 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.779908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.779947 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.779990 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.780024 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.883544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.883611 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.883625 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.883647 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.883665 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.986264 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.986329 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.986345 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.986370 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:27 crc kubenswrapper[4907]: I1129 14:29:27.986386 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:27Z","lastTransitionTime":"2025-11-29T14:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.089646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.089691 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.089704 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.089728 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.089742 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:28Z","lastTransitionTime":"2025-11-29T14:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.192326 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.192384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.192396 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.192418 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.192453 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:28Z","lastTransitionTime":"2025-11-29T14:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.294901 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.294967 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.294985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.295013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.295032 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:28Z","lastTransitionTime":"2025-11-29T14:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.397120 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.397178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.397191 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.397212 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.397223 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:28Z","lastTransitionTime":"2025-11-29T14:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.500559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.500592 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.500602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.500617 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.500627 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:28Z","lastTransitionTime":"2025-11-29T14:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.603049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.603087 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.603096 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.603112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.603123 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:28Z","lastTransitionTime":"2025-11-29T14:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.705678 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.705938 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.706020 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.706057 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.706133 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:28Z","lastTransitionTime":"2025-11-29T14:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.808819 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.808874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.808884 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.808900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.808911 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:28Z","lastTransitionTime":"2025-11-29T14:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.910918 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.910963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.910972 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.910989 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.910999 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:28Z","lastTransitionTime":"2025-11-29T14:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.962859 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/0.log" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.962921 4907 generic.go:334] "Generic (PLEG): container finished" podID="3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4" containerID="bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717" exitCode=1 Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.962953 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5zvb" event={"ID":"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4","Type":"ContainerDied","Data":"bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717"} Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.963367 4907 scope.go:117] "RemoveContainer" containerID="bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717" Nov 29 14:29:28 crc kubenswrapper[4907]: I1129 14:29:28.983358 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:28Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.001331 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:28Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.013536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.013577 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.013667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.013686 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.013698 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.019784 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.037267 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:28Z\\\",\\\"message\\\":\\\"2025-11-29T14:28:43+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e\\\\n2025-11-29T14:28:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e to /host/opt/cni/bin/\\\\n2025-11-29T14:28:43Z [verbose] multus-daemon started\\\\n2025-11-29T14:28:43Z [verbose] Readiness Indicator file check\\\\n2025-11-29T14:29:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.049182 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec13974a-428b-4841-9234-f3f70b6f2857\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c842d45e7b04ef536026a952134478e9f8aba8dc779b6bc127d2fc89063af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1818b34bee237f8b9788cae86c3541ecb29f693da7f3008bda027c4fe45618db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd8e9e3c38d0d0710dd8297cc120bf4ec2bf18f297b0dc850513d2096377636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.077579 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.090077 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\"
,\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.103808 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.117600 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.117860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.117885 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.117895 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc 
kubenswrapper[4907]: I1129 14:29:29.117911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.117921 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.135769 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68
19c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.149250 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.164093 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.175938 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.185814 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.193790 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.204841 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.220965 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.220988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.220997 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.221014 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.221024 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.221105 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.235620 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.324075 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.324349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.324362 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.324380 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.324390 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.427285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.427324 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.427332 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.427352 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.427360 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.479285 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.479320 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.479419 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:29 crc kubenswrapper[4907]: E1129 14:29:29.479641 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.479709 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:29 crc kubenswrapper[4907]: E1129 14:29:29.479840 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:29 crc kubenswrapper[4907]: E1129 14:29:29.479943 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:29 crc kubenswrapper[4907]: E1129 14:29:29.480294 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.495911 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.530488 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.530553 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.530574 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.530603 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.530627 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.633318 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.633627 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.633646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.633670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.633688 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.736593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.736647 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.736665 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.736687 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.736705 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.839181 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.839219 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.839237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.839260 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.839275 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.942021 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.942072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.942090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.942114 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.942131 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:29Z","lastTransitionTime":"2025-11-29T14:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.969948 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/0.log" Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.970186 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5zvb" event={"ID":"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4","Type":"ContainerStarted","Data":"6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8"} Nov 29 14:29:29 crc kubenswrapper[4907]: I1129 14:29:29.993942 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:29Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.013818 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.038502 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.044952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.045057 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.045120 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.045188 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.045245 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.057937 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.074059 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.087944 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.102942 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.119327 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.139971 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.148634 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.148719 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.148743 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.148780 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.148806 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.163082 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.177776 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.192786 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec13974a-428b-4841-9234-f3f70b6f2857\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c842d45e7b04ef536026a952134478e9f8aba8dc779b6bc127d2fc89063af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1818b34bee237f8b9788cae86c3541ecb29f693da7f3008bda027c4fe45618db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd8e9e3c38d0d0710dd8297cc120bf4ec2bf18f297b0dc850513d2096377636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d5b27c
2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.206252 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b9f3229-7103-4330-a56a-7eee8a8d12e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c722630ed7d79458dc8b77d6193c617baa5e6778268c59b056a310447612d3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.236964 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.251846 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.252130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.252308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.252491 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.252640 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.257713 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.277971 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.298010 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.315155 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.334105 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:28Z\\\",\\\"message\\\":\\\"2025-11-29T14:28:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e\\\\n2025-11-29T14:28:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e to /host/opt/cni/bin/\\\\n2025-11-29T14:28:43Z [verbose] multus-daemon started\\\\n2025-11-29T14:28:43Z [verbose] 
Readiness Indicator file check\\\\n2025-11-29T14:29:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:30Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.355724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.355929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.356065 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.356215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.356301 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.459070 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.459153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.459180 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.459211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.459233 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.562365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.562760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.562961 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.563106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.563247 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.666384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.666469 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.666483 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.666506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.666520 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.769922 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.769981 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.769999 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.770026 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.770045 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.872764 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.872807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.872824 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.872843 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.872858 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.975191 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.975261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.975286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.975316 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:30 crc kubenswrapper[4907]: I1129 14:29:30.975338 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:30Z","lastTransitionTime":"2025-11-29T14:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.078827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.078904 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.078928 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.078962 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.078986 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:31Z","lastTransitionTime":"2025-11-29T14:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.182236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.182337 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.182357 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.182419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.182465 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:31Z","lastTransitionTime":"2025-11-29T14:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.285757 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.285815 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.285837 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.285861 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.285880 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:31Z","lastTransitionTime":"2025-11-29T14:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.389017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.389061 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.389071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.389090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.389103 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:31Z","lastTransitionTime":"2025-11-29T14:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.479233 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.479272 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.479239 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:31 crc kubenswrapper[4907]: E1129 14:29:31.479370 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:31 crc kubenswrapper[4907]: E1129 14:29:31.479875 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:31 crc kubenswrapper[4907]: E1129 14:29:31.479983 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.480241 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:31 crc kubenswrapper[4907]: E1129 14:29:31.480363 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.491594 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.491656 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.491675 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.492093 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.492149 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:31Z","lastTransitionTime":"2025-11-29T14:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.594975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.595029 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.595046 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.595069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.595086 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:31Z","lastTransitionTime":"2025-11-29T14:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.697183 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.697249 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.697264 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.697282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.697294 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:31Z","lastTransitionTime":"2025-11-29T14:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.800317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.800372 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.800385 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.800405 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.800421 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:31Z","lastTransitionTime":"2025-11-29T14:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.902910 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.902962 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.902977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.902995 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:31 crc kubenswrapper[4907]: I1129 14:29:31.903009 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:31Z","lastTransitionTime":"2025-11-29T14:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.005323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.005373 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.005384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.005401 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.005413 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.109021 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.109059 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.109119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.109139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.109152 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.212761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.212794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.212804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.212826 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.212837 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.315847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.315892 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.315900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.315913 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.315923 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.419707 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.419745 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.419755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.419770 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.419782 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.498558 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.511628 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.521956 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.521992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.522000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.522014 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.522024 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.524326 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.535885 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.549141 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.563332 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.593740 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} 
name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.614518 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.631603 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"exitCode\\\":1,
\\\"finishedAt\\\":\\\"2025-11-29T14:29:28Z\\\",\\\"message\\\":\\\"2025-11-29T14:28:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e\\\\n2025-11-29T14:28:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e to /host/opt/cni/bin/\\\\n2025-11-29T14:28:43Z [verbose] multus-daemon started\\\\n2025-11-29T14:28:43Z [verbose] Readiness Indicator file check\\\\n2025-11-29T14:29:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostro
ot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.631715 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.631744 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.631753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.631771 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 
14:29:32.631780 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.648099 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec13974a-428b-4841-9234-f3f70b6f2857\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c842d45e7b04ef536026a952134478e9f8aba8dc779b6bc127d2fc89063af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1818b34bee237f8b9788cae86c3541ecb29f693da7f3008bda027c4fe45618db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd8e9e3c38d0d0710dd8297cc120bf4ec2bf18f297b0dc850513d2096377636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.662906 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b9f3229-7103-4330-a56a-7eee8a8d12e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c722630ed7d79458dc8b77d6193c617baa5e6778268c59b056a310447612d3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.694681 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.712205 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 
UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:
23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.726128 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1c
b99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.734305 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.734350 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.734369 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.734393 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.734412 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.739309 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.757296 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.770686 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.784337 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.801907 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:32Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.837538 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.837579 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.837620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.837636 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.837648 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.940045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.940101 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.940118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.940142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:32 crc kubenswrapper[4907]: I1129 14:29:32.940159 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:32Z","lastTransitionTime":"2025-11-29T14:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.043422 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.043513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.043536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.043564 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.043582 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.146401 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.146504 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.146523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.146553 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.146577 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.249355 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.249410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.249420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.249466 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.249478 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.352507 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.352598 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.352628 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.352664 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.352688 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.456219 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.456292 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.456315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.456342 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.456364 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.478949 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.479026 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.478991 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.479030 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:33 crc kubenswrapper[4907]: E1129 14:29:33.479173 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:33 crc kubenswrapper[4907]: E1129 14:29:33.479293 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:33 crc kubenswrapper[4907]: E1129 14:29:33.479567 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:33 crc kubenswrapper[4907]: E1129 14:29:33.479678 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.559431 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.559512 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.559524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.559549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.559563 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.662184 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.662237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.662253 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.662275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.662292 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.765402 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.765484 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.765503 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.765533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.765550 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.868051 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.868130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.868149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.868177 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.868200 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.971046 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.971116 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.971139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.971167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:33 crc kubenswrapper[4907]: I1129 14:29:33.971185 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:33Z","lastTransitionTime":"2025-11-29T14:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.073804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.073870 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.073890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.073917 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.073937 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:34Z","lastTransitionTime":"2025-11-29T14:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.176586 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.176652 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.176672 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.176702 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.176721 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:34Z","lastTransitionTime":"2025-11-29T14:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.279351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.279398 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.279410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.279452 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.279467 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:34Z","lastTransitionTime":"2025-11-29T14:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.382879 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.382927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.382944 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.382969 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.382990 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:34Z","lastTransitionTime":"2025-11-29T14:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.485519 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.485556 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.485564 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.485577 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.485586 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:34Z","lastTransitionTime":"2025-11-29T14:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.588676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.588742 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.588762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.588787 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.588805 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:34Z","lastTransitionTime":"2025-11-29T14:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.691976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.692045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.692066 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.692097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.692120 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:34Z","lastTransitionTime":"2025-11-29T14:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.795548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.795612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.795630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.795654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.795674 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:34Z","lastTransitionTime":"2025-11-29T14:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.898107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.898149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.898161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.898202 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:34 crc kubenswrapper[4907]: I1129 14:29:34.898230 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:34Z","lastTransitionTime":"2025-11-29T14:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.000744 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.000820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.000844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.000874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.000896 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.104184 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.104236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.104254 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.104276 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.104292 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.206910 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.207001 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.207019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.207043 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.207059 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.309933 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.309992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.310011 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.310036 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.310056 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.412822 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.412882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.412899 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.412922 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.412940 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.478713 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.478793 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.478790 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.478738 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:35 crc kubenswrapper[4907]: E1129 14:29:35.478928 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:35 crc kubenswrapper[4907]: E1129 14:29:35.479126 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:35 crc kubenswrapper[4907]: E1129 14:29:35.479278 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:35 crc kubenswrapper[4907]: E1129 14:29:35.479393 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.515746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.515813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.515834 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.515859 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.515877 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.618937 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.619014 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.619033 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.619064 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.619084 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.672200 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.672255 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.672273 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.672298 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.672320 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: E1129 14:29:35.689228 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:35Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.694090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.694143 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.694161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.694186 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.694206 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: E1129 14:29:35.706950 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:35Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.712294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.712350 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.712372 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.712406 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.712431 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: E1129 14:29:35.729892 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:35Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.735698 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.735762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.735789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.735819 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.735842 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.760159 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.760222 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.760245 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.760274 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.760296 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: E1129 14:29:35.773487 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:35Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:35 crc kubenswrapper[4907]: E1129 14:29:35.773764 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.775846 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.775898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.775915 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.775941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.775958 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.879344 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.879420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.879472 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.879507 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.879529 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.982255 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.982302 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.982319 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.982339 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:35 crc kubenswrapper[4907]: I1129 14:29:35.982355 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:35Z","lastTransitionTime":"2025-11-29T14:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.084943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.084996 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.085017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.085039 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.085056 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:36Z","lastTransitionTime":"2025-11-29T14:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.188075 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.188118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.188134 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.188155 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.188171 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:36Z","lastTransitionTime":"2025-11-29T14:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.290466 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.290513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.290529 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.290551 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.290567 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:36Z","lastTransitionTime":"2025-11-29T14:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.393391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.393640 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.393675 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.393704 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.393725 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:36Z","lastTransitionTime":"2025-11-29T14:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.496466 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.496606 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.496638 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.496664 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.496686 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:36Z","lastTransitionTime":"2025-11-29T14:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.599775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.599830 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.599852 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.599877 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.599899 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:36Z","lastTransitionTime":"2025-11-29T14:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.703045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.703102 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.703119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.703143 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.703162 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:36Z","lastTransitionTime":"2025-11-29T14:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.806184 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.806248 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.806265 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.806289 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.806310 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:36Z","lastTransitionTime":"2025-11-29T14:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.910151 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.910212 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.910230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.910256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:36 crc kubenswrapper[4907]: I1129 14:29:36.910276 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:36Z","lastTransitionTime":"2025-11-29T14:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.013034 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.013090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.013107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.013132 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.013150 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.116475 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.116543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.116561 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.116587 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.116604 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.219262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.219306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.219319 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.219335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.219346 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.321516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.321579 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.321596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.321621 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.321644 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.424124 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.424185 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.424204 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.424229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.424248 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.479379 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.479418 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.479512 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.479419 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:37 crc kubenswrapper[4907]: E1129 14:29:37.479838 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:37 crc kubenswrapper[4907]: E1129 14:29:37.479991 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:37 crc kubenswrapper[4907]: E1129 14:29:37.480122 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:37 crc kubenswrapper[4907]: E1129 14:29:37.480751 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.481517 4907 scope.go:117] "RemoveContainer" containerID="06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.526590 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.526646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.526663 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.526689 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.526707 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.629534 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.629584 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.629599 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.629617 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.629630 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.733097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.733159 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.733176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.733201 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.733218 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.836085 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.836164 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.836182 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.836208 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.836267 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.938971 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.939074 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.939093 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.939120 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:37 crc kubenswrapper[4907]: I1129 14:29:37.939139 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:37Z","lastTransitionTime":"2025-11-29T14:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.041145 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.041202 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.041223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.041250 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.041271 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.143808 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.143868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.143885 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.143912 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.143936 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.247421 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.247545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.247571 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.247603 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.247632 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.350750 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.350810 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.350829 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.350857 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.350878 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.454337 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.454404 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.454423 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.454481 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.454500 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.557205 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.557256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.557269 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.557291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.557305 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.661597 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.661659 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.661675 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.661701 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.661716 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.765882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.765968 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.765987 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.766015 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.766033 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.869479 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.869541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.869559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.869587 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.869611 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.973200 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.973256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.973273 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.973299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:38 crc kubenswrapper[4907]: I1129 14:29:38.973317 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:38Z","lastTransitionTime":"2025-11-29T14:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.013229 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/2.log" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.017956 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.018948 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.039938 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.058622 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.076565 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.076775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.077048 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.077285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.077521 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:39Z","lastTransitionTime":"2025-11-29T14:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.079666 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:28Z\\\",\\\"message\\\":\\\"2025-11-29T14:28:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e\\\\n2025-11-29T14:28:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e to /host/opt/cni/bin/\\\\n2025-11-29T14:28:43Z [verbose] multus-daemon started\\\\n2025-11-29T14:28:43Z [verbose] Readiness Indicator file check\\\\n2025-11-29T14:29:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.097137 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec13974a-428b-4841-9234-f3f70b6f2857\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c842d45e7b04ef536026a952134478e9f8aba8dc779b6bc127d2fc89063af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1818b34bee237f8b9788cae86c3541ecb29f693da7f3008bda027c4fe45618db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd8e9e3c38d0d0710dd8297cc120bf4ec2bf18f297b0dc850513d2096377636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.111650 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b9f3229-7103-4330-a56a-7eee8a8d12e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c722630ed7d79458dc8b77d6193c617baa5e6778268c59b056a310447612d3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.144552 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.168545 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 
UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:
23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.180815 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.180887 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.180905 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.180931 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.180948 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:39Z","lastTransitionTime":"2025-11-29T14:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.189548 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.208509 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.225540 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.248234 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.267774 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.284216 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.284271 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.284293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.284326 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.284349 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:39Z","lastTransitionTime":"2025-11-29T14:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.285579 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.302719 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.320781 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.338964 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.359497 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:2
8:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd
2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.387838 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.387906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.387924 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:39 crc kubenswrapper[4907]: 
I1129 14:29:39.387950 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.387969 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:39Z","lastTransitionTime":"2025-11-29T14:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.393033 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.409476 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:39Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:39 crc 
kubenswrapper[4907]: I1129 14:29:39.479491 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.479540 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.479519 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:39 crc kubenswrapper[4907]: E1129 14:29:39.479689 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:39 crc kubenswrapper[4907]: E1129 14:29:39.479807 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:39 crc kubenswrapper[4907]: E1129 14:29:39.479892 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.480158 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:39 crc kubenswrapper[4907]: E1129 14:29:39.480403 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.490818 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.490865 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.490883 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.490906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.490924 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:39Z","lastTransitionTime":"2025-11-29T14:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.593299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.593366 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.593385 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.593411 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.593428 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:39Z","lastTransitionTime":"2025-11-29T14:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.696234 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.696551 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.696740 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.697378 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.697931 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:39Z","lastTransitionTime":"2025-11-29T14:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.800550 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.800618 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.800635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.800661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.800680 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:39Z","lastTransitionTime":"2025-11-29T14:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.904039 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.904108 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.904131 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.904164 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:39 crc kubenswrapper[4907]: I1129 14:29:39.904186 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:39Z","lastTransitionTime":"2025-11-29T14:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.007710 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.007767 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.007784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.007812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.007831 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.024294 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/3.log" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.025322 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/2.log" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.029843 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92" exitCode=1 Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.030026 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.030086 4907 scope.go:117] "RemoveContainer" containerID="06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.030888 4907 scope.go:117] "RemoveContainer" containerID="34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92" Nov 29 14:29:40 crc kubenswrapper[4907]: E1129 14:29:40.031141 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.054735 4907 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.072652 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.094832 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.110888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.110953 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.110977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.111013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.111040 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.114626 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.134119 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.149921 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.166088 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.183160 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.206155 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.215984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.216048 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.216070 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.216101 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.216124 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.240156 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06aac349986d3db78ed5adbe7c946837470c2d5a441a4ab3e630c62bf6f1ed41\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:08Z\\\",\\\"message\\\":\\\"troller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 14:29:08.497770 6545 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1129 14:29:08.497797 6545 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF1129 14:29:08.497816 6545 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:39Z\\\",\\\"message\\\":\\\" for network=default: \\\\u0026Service{ObjectMeta:{console openshift-console da99054c-338b-4216-8e73-72be1a1258a4 12350 0 2025-02-23 05:39:23 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:console] map[operator.openshift.io/spec-hash:5a95972a23c40ab49ce88af0712f389072cea6a9798f6e5350b856d92bc3bd6d service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:console-serving-cert 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: console,component: ui,},ClusterIP:10.217.5.194,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.194],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1129 14:29:38.916116 6902 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to com\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.258162 4907 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc 
kubenswrapper[4907]: I1129 14:29:40.275724 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec13974a-428b-4841-9234-f3f70b6f2857\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c842d45e7b04ef536026a952134478e9f8aba8dc779b6bc127d2fc89063af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1818b34bee237f8b9788cae86c3541ecb29f693da7f3008bda027c4fe45618db\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd8e9e3c38d0d0710dd8297cc120bf4ec2bf18f297b0dc850513d2096377636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.290133 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b9f3229-7103-4330-a56a-7eee8a8d12e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c722630ed7d79458dc8b77d6193c617baa5e6778268c59b056a310447612d3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.320714 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.321256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.321596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.321760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.321918 4907 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.324171 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46
de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.348112 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\"
,\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.367666 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.386489 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.404549 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.424828 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.424890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.424902 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.424919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.425328 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.425767 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:28Z\\\",\\\"message\\\":\\\"2025-11-29T14:28:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e\\\\n2025-11-29T14:28:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e to /host/opt/cni/bin/\\\\n2025-11-29T14:28:43Z [verbose] multus-daemon started\\\\n2025-11-29T14:28:43Z [verbose] Readiness Indicator file check\\\\n2025-11-29T14:29:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:40Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.528868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.528935 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.528951 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.528974 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.528990 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.632013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.632092 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.632114 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.632145 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.632168 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.735532 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.735605 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.735623 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.735652 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.735672 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.839589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.839668 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.839686 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.839718 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.839740 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.943651 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.943731 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.943776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.943807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:40 crc kubenswrapper[4907]: I1129 14:29:40.943828 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:40Z","lastTransitionTime":"2025-11-29T14:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.038865 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/3.log" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.045642 4907 scope.go:117] "RemoveContainer" containerID="34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92" Nov 29 14:29:41 crc kubenswrapper[4907]: E1129 14:29:41.046115 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.046558 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.046606 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.046628 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.046655 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.046676 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.070900 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.103774 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:39Z\\\",\\\"message\\\":\\\" for network=default: \\\\u0026Service{ObjectMeta:{console openshift-console da99054c-338b-4216-8e73-72be1a1258a4 12350 0 2025-02-23 05:39:23 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:console] map[operator.openshift.io/spec-hash:5a95972a23c40ab49ce88af0712f389072cea6a9798f6e5350b856d92bc3bd6d 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:console-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: console,component: ui,},ClusterIP:10.217.5.194,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.194],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1129 14:29:38.916116 6902 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to com\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.123862 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.146621 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"exitCode\\\":1,
\\\"finishedAt\\\":\\\"2025-11-29T14:29:28Z\\\",\\\"message\\\":\\\"2025-11-29T14:28:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e\\\\n2025-11-29T14:28:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e to /host/opt/cni/bin/\\\\n2025-11-29T14:28:43Z [verbose] multus-daemon started\\\\n2025-11-29T14:28:43Z [verbose] Readiness Indicator file check\\\\n2025-11-29T14:29:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostro
ot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.149871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.149929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.149950 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.149977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 
14:29:41.149995 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.168551 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec13974a-428b-4841-9234-f3f70b6f2857\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c842d45e7b04ef536026a952134478e9f8aba8dc779b6bc127d2fc89063af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1818b34bee237f8b9788cae86c3541ecb29f693da7f3008bda027c4fe45618db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd8e9e3c38d0d0710dd8297cc120bf4ec2bf18f297b0dc850513d2096377636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.183684 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b9f3229-7103-4330-a56a-7eee8a8d12e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c722630ed7d79458dc8b77d6193c617baa5e6778268c59b056a310447612d3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.226247 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.249972 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 
UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:
23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.255345 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.255487 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.255516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.255546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.255565 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.275413 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.296879 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.316743 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.338288 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.359522 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.359571 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.359591 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.359618 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.359638 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.359743 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2
dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.390414 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1
f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:
49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.413549 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.435374 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.456263 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.462807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.462865 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.462887 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.462919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.462942 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.472643 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.478599 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.478670 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.478693 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.478631 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:41 crc kubenswrapper[4907]: E1129 14:29:41.478802 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:41 crc kubenswrapper[4907]: E1129 14:29:41.478923 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:41 crc kubenswrapper[4907]: E1129 14:29:41.479021 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:41 crc kubenswrapper[4907]: E1129 14:29:41.479110 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.491760 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:41Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.565342 4907 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.565396 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.565414 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.565471 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.565489 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.669047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.669149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.669168 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.669201 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.669227 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.776167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.776246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.776263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.776290 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.776308 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.879513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.879597 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.879623 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.879661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.879687 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.983003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.983079 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.983131 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.983165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:41 crc kubenswrapper[4907]: I1129 14:29:41.983189 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:41Z","lastTransitionTime":"2025-11-29T14:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.085772 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.085842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.085862 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.085889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.085907 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:42Z","lastTransitionTime":"2025-11-29T14:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.189199 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.189283 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.189302 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.189336 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.189357 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:42Z","lastTransitionTime":"2025-11-29T14:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.292542 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.292618 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.292636 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.292662 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.292682 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:42Z","lastTransitionTime":"2025-11-29T14:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.395839 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.395898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.395916 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.395939 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.395957 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:42Z","lastTransitionTime":"2025-11-29T14:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.499729 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.499783 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.499829 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.499856 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.499873 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:42Z","lastTransitionTime":"2025-11-29T14:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.501884 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98218f13-4c92-4d49-91fa-397cfd5a5d53\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa5cd3f99ca816e803bcbab1eec5e26a0ee0e3e6c2bb3680bddca8715ed4df4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b94ac8ce9
058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c85cd57a0906727318afd99af6a3f17b96d4002f5a4f8d570acc371d0b29549f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0d26aaf712be9a9455215138bcddeb2bd6e12d2c161a19a614c03fd50be3cd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.536341 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5339013-9544-4e7e-a449-c257f1086638\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:39Z\\\",\\\"message\\\":\\\" for network=default: \\\\u0026Service{ObjectMeta:{console openshift-console da99054c-338b-4216-8e73-72be1a1258a4 12350 0 2025-02-23 05:39:23 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:console] map[operator.openshift.io/spec-hash:5a95972a23c40ab49ce88af0712f389072cea6a9798f6e5350b856d92bc3bd6d 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:console-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: console,component: ui,},ClusterIP:10.217.5.194,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.194],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1129 14:29:38.916116 6902 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to com\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e4cd47ba08ce26e43
387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grsbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dtnl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.555551 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25ct5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f50e55a-d427-4cde-a639-d6c7597e937a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wt9sw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25ct5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.579318 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebc5706d0ecdcf2912d3ce428907683e23cbeb9463f3e1dfcde9c8fbbcbb0f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d0560e5fed2e194222e5d803cd67d190174a3848da5642830e17d2c839b9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.598932 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.604073 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.604130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.604153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.604186 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.604210 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:42Z","lastTransitionTime":"2025-11-29T14:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.620501 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d5zvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T14:29:28Z\\\",\\\"message\\\":\\\"2025-11-29T14:28:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e\\\\n2025-11-29T14:28:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c3c22201-d9b8-486a-9acf-51a1ade5667e to /host/opt/cni/bin/\\\\n2025-11-29T14:28:43Z [verbose] multus-daemon started\\\\n2025-11-29T14:28:43Z [verbose] Readiness Indicator file check\\\\n2025-11-29T14:29:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sj2tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d5zvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.640148 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec13974a-428b-4841-9234-f3f70b6f2857\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c842d45e7b04ef536026a952134478e9f8aba8dc779b6bc127d2fc89063af4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1818b34bee237f8b9788cae86c3541ecb29f693da7f3008bda027c4fe45618db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://afd8e9e3c38d0d0710dd8297cc120bf4ec2bf18f297b0dc850513d2096377636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d3d5b27c2288e7b39e018067e45a474e05fd572f8bcf72cd43be3af501d6b613\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.658518 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b9f3229-7103-4330-a56a-7eee8a8d12e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c722630ed7d79458dc8b77d6193c617baa5e6778268c59b056a310447612d3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fe79452df710a2033161adf60fa9f0e00772f062db76a4f55e26aef9880345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.694424 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288a8407-fb76-4d2f-9b40-c16545397f0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f769658c36c61960951b83f292a59b9c66b80b2895a207899f4fdb1d2890d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5ebca4c5dc03c8bce8b6583c0903e2df1c9f0760079b0955c8d13aadcfa397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5262920a57f3957c38312782c14fb41bef91b7d72b7c0f4c81fdeaef74ae4098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8fb8c453d5e4c36de9a168bfcc26685428032b67a400057be5059a74d84ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db702d71d3abc1a30999bfdb27bd3c3d5e0f64e5fdcf9039d9313c0eddb2d3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d6bf9161124ce0d4750657443251431ce46de7e82c21a84c21c9737c80020\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bb354ab8c86d02f55fd6ec4357b6882b9241ac79814cacd7ca42924a15a23df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fce098708ec17c65531b02fa591d5a2ec9051c4ee28e6acaccc5f35383d75ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.707789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.707848 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.707868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.707895 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.707914 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:42Z","lastTransitionTime":"2025-11-29T14:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.719103 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76002832-0954-42e0-85c2-fec6eef37411\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T14:28:40Z\\\",\\\"message\\\":\\\"40.694977 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764426504\\\\\\\\\\\\\\\" (2025-11-29 14:28:24 +0000 UTC to 2025-12-29 14:28:25 +0000 UTC (now=2025-11-29 14:28:40.694131608 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.695773 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1129 14:28:40.695788 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1129 14:28:40.696971 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764426515\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764426515\\\\\\\\\\\\\\\" (2025-11-29 13:28:35 +0000 UTC to 2026-11-29 13:28:35 +0000 UTC (now=2025-11-29 14:28:40.696922634 +0000 UTC))\\\\\\\"\\\\nI1129 14:28:40.697064 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1129 14:28:40.697123 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1129 14:28:40.697263 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1129 14:28:40.698242 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1129 14:28:40.698260 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1129 14:28:40.698277 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4193871863/tls.crt::/tmp/serving-cert-4193871863/tls.key\\\\\\\"\\\\nF1129 14:28:40.699839 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1129 14:28:40.706014 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.740699 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321f273d639585170afa20197ed8f22364fa6e713445e3a4d977c70005d3b178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.760918 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.779042 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58e4d8d7-8362-41f0-80eb-c07a9219ffbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aeec711dc2dc9cc529b131497c8cc6252ded5606af3f913080e8c29dc64cde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303
f24e70de834bd49404482c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qg5ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t4jq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.797650 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pngnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27b9dc6c-d485-4b7b-94b1-e71337539997\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://050c9bd8676c1e50dd1f4d16249c786156d7e2c112e29ab8af5fafec535886a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://178aec8ae615c0f8428192b4163ef2f9eb306bddef2b1819780b2be0eafbae22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5adee0f37b3bd5f5e90ec9761cb6b9e458ffaabcd373d270edd3df19b9d5c6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c42a7ee0ca9c4ac780b093653577ab591e1d5ef2e0dbdf8c3a56b7955e411b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3de
035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa3de035406830e734b458398e2313d5e020295458a1f9c99216198392ac4a9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36052e52931c736a1e865c75bcf8f610c6d153d958c74ffbb89cf1f2ca28c2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6819c9dda7982bfa7dfb1e6a993d75bf3c3f9f90ac9abc712b3cc873d87ecbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T14:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T14:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvvbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pngnb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.811289 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.811362 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.811384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.811418 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.811475 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:42Z","lastTransitionTime":"2025-11-29T14:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.819918 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.840167 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c746c4307c204b642fc4ccbf256ddef0a4a304f6b1a0a261974f49be6ecc2b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.857642 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c92rh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff04d25-6931-42f8-af97-0f231dfb8d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6941f1f0f75017f32d8a2359145e9cb6fd2385aec0291acc9454e0de129b2797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-2hztt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c92rh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.874644 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vg6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0767139-51fc-4c53-aa4d-c52b815fcc81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400ae5ab7c3a6d4efce625c8f316769aa58fdfda431cc0faa165e0
fdad6f8328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slcqw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vg6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.900912 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cad52eb-140b-46cc-bbe1-fdada0728e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T14:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54752c483834122253a862c0eccb5587e9549e62d09bfb04f9078969a1759231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11b55c3028b5a03b82c77911c2890b6d8aaa
17dd653482f45ef3eb91b348b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T14:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T14:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2hghh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:42Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.915906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.915983 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.916004 4907 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.916033 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:42 crc kubenswrapper[4907]: I1129 14:29:42.916060 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:42Z","lastTransitionTime":"2025-11-29T14:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.020003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.020750 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.020776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.020804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.020826 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.124016 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.124084 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.124104 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.124133 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.124150 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.227833 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.227892 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.227912 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.227937 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.227954 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.332146 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.332198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.332216 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.332238 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.332257 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.435561 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.435618 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.435635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.435661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.435681 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.479102 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.479178 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.479124 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.479124 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:43 crc kubenswrapper[4907]: E1129 14:29:43.479290 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:43 crc kubenswrapper[4907]: E1129 14:29:43.479402 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:43 crc kubenswrapper[4907]: E1129 14:29:43.479663 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:43 crc kubenswrapper[4907]: E1129 14:29:43.479800 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.538768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.538869 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.538888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.538963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.538982 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.641998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.642095 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.642122 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.642155 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.642180 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.745153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.745198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.745216 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.745237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.745254 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.847511 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.847555 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.847572 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.847591 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.847608 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.950817 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.950883 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.950904 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.950927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:43 crc kubenswrapper[4907]: I1129 14:29:43.950947 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:43Z","lastTransitionTime":"2025-11-29T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.053015 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.053063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.053078 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.053094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.053112 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.156300 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.156360 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.156379 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.156402 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.156422 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.259365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.259426 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.259473 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.259498 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.259519 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.362246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.362308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.362327 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.362356 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.362374 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.465850 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.465937 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.465955 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.465982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.466000 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.569262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.569295 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.569304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.569317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.569328 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.672156 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.672227 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.672250 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.672274 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.672292 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.775417 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.775505 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.775528 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.775559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.775579 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.879162 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.879224 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.879242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.879267 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.879284 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.981994 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.982037 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.982045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.982060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:44 crc kubenswrapper[4907]: I1129 14:29:44.982074 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:44Z","lastTransitionTime":"2025-11-29T14:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.084725 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.084767 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.084779 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.084796 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.084810 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:45Z","lastTransitionTime":"2025-11-29T14:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.186790 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.186848 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.186871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.186898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.186919 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:45Z","lastTransitionTime":"2025-11-29T14:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.289751 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.289806 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.289823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.289847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.289865 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:45Z","lastTransitionTime":"2025-11-29T14:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.393045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.393088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.393104 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.393126 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.393142 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:45Z","lastTransitionTime":"2025-11-29T14:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.478582 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.478653 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.478699 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.478858 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.478943 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.479100 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.479260 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.479374 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.495912 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.495950 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.495966 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.495986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.496001 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:45Z","lastTransitionTime":"2025-11-29T14:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.563020 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563168 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.563141147 +0000 UTC m=+147.549978829 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.563232 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.563305 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.563339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.563375 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563429 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563546 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.563518357 +0000 UTC m=+147.550356049 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563553 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563562 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563587 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563599 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563462 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563618 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563607 4907 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563675 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.563660691 +0000 UTC m=+147.550498383 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563789 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.563764134 +0000 UTC m=+147.550601826 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 14:29:45 crc kubenswrapper[4907]: E1129 14:29:45.563816 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.563799075 +0000 UTC m=+147.550636767 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.598945 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.599069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.599094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.599122 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.599146 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:45Z","lastTransitionTime":"2025-11-29T14:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.701811 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.701873 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.701890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.701915 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.701932 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:45Z","lastTransitionTime":"2025-11-29T14:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.804423 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.804506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.804522 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.804545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.804556 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:45Z","lastTransitionTime":"2025-11-29T14:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.908617 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.908689 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.908711 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.908741 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:45 crc kubenswrapper[4907]: I1129 14:29:45.908763 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:45Z","lastTransitionTime":"2025-11-29T14:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.011986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.012060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.012081 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.012111 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.012133 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.115104 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.115183 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.115207 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.115234 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.115254 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.145745 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.145794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.145811 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.145832 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.145849 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: E1129 14:29:46.167301 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.172058 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.172098 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.172109 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.172127 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.172138 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: E1129 14:29:46.193080 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.197742 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.197807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.197824 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.197849 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.197864 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: E1129 14:29:46.216221 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.220285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.220371 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.220391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.220421 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.220476 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: E1129 14:29:46.264973 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T14:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78619aac-2f63-40e5-809f-f2f742346ccf\\\",\\\"systemUUID\\\":\\\"aa1144e5-f0f0-4c33-8960-c154529ab598\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T14:29:46Z is after 2025-08-24T17:21:41Z" Nov 29 14:29:46 crc kubenswrapper[4907]: E1129 14:29:46.265202 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.267377 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.267480 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.267510 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.267544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.267572 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.370814 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.370872 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.370896 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.370925 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.370950 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.474286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.474335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.474343 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.474363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.474372 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.576932 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.577000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.577024 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.577056 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.577082 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.680415 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.680572 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.680590 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.680621 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.680639 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.784326 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.784388 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.784403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.784425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.784465 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.888245 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.888305 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.888325 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.888348 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.888366 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.991722 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.991790 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.991807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.991837 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:46 crc kubenswrapper[4907]: I1129 14:29:46.991858 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:46Z","lastTransitionTime":"2025-11-29T14:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.095419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.095528 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.095546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.095574 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.095591 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:47Z","lastTransitionTime":"2025-11-29T14:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.199329 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.199479 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.199506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.199535 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.199552 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:47Z","lastTransitionTime":"2025-11-29T14:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.301879 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.301944 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.301961 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.301989 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.302006 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:47Z","lastTransitionTime":"2025-11-29T14:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.406032 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.406100 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.406117 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.406144 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.406165 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:47Z","lastTransitionTime":"2025-11-29T14:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.478843 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.478910 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.479000 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.478994 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:47 crc kubenswrapper[4907]: E1129 14:29:47.479223 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:47 crc kubenswrapper[4907]: E1129 14:29:47.479416 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:47 crc kubenswrapper[4907]: E1129 14:29:47.479571 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:47 crc kubenswrapper[4907]: E1129 14:29:47.479661 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.510572 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.510668 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.510688 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.510714 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.510732 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:47Z","lastTransitionTime":"2025-11-29T14:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.614195 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.614940 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.615063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.615103 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.615127 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:47Z","lastTransitionTime":"2025-11-29T14:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.717963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.718027 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.718046 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.718071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.718089 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:47Z","lastTransitionTime":"2025-11-29T14:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.821513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.821581 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.821599 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.821637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.821657 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:47Z","lastTransitionTime":"2025-11-29T14:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.924956 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.925016 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.925034 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.925067 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:47 crc kubenswrapper[4907]: I1129 14:29:47.925085 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:47Z","lastTransitionTime":"2025-11-29T14:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.027726 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.027790 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.027812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.027841 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.027862 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.130107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.130152 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.130168 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.130189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.130205 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.232888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.232941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.232958 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.232982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.232999 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.335955 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.336014 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.336031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.336054 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.336074 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.440018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.440092 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.440110 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.440141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.440163 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.543719 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.543790 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.543807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.543833 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.543851 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.646699 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.646781 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.646806 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.646836 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.646857 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.749842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.749901 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.749919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.749945 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.749964 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.853402 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.853505 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.853525 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.853557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.853580 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.957147 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.957211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.957230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.957259 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:48 crc kubenswrapper[4907]: I1129 14:29:48.957278 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:48Z","lastTransitionTime":"2025-11-29T14:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.060223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.060303 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.060326 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.060357 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.060379 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.163394 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.163510 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.163530 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.163556 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.163575 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.266866 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.266930 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.266948 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.266975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.266992 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.370606 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.370704 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.370724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.370767 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.370790 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.474018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.474122 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.474142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.474174 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.474196 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.479328 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.479423 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5"
Nov 29 14:29:49 crc kubenswrapper[4907]: E1129 14:29:49.479626 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.479678 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.479650 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 14:29:49 crc kubenswrapper[4907]: E1129 14:29:49.479833 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a"
Nov 29 14:29:49 crc kubenswrapper[4907]: E1129 14:29:49.480036 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 29 14:29:49 crc kubenswrapper[4907]: E1129 14:29:49.480190 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.577165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.577326 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.577352 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.577381 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.577400 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.680693 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.680765 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.680791 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.680822 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.680842 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.784184 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.784285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.784306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.784331 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.784352 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.886976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.887055 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.887076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.887108 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.887132 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.990008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.990069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.990089 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.990118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:49 crc kubenswrapper[4907]: I1129 14:29:49.990137 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:49Z","lastTransitionTime":"2025-11-29T14:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.093113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.093187 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.093212 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.093242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.093264 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:50Z","lastTransitionTime":"2025-11-29T14:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.196721 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.196811 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.196832 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.196867 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.196894 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:50Z","lastTransitionTime":"2025-11-29T14:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.300227 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.300303 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.300321 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.300350 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.300369 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:50Z","lastTransitionTime":"2025-11-29T14:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.404130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.404206 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.404230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.404267 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.404292 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:50Z","lastTransitionTime":"2025-11-29T14:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.507184 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.507244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.507261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.507284 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.507303 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:50Z","lastTransitionTime":"2025-11-29T14:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.609590 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.609644 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.609666 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.609686 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.609703 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:50Z","lastTransitionTime":"2025-11-29T14:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.712090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.712169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.712193 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.712225 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.712250 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:50Z","lastTransitionTime":"2025-11-29T14:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.815399 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.815503 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.815521 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.815550 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.815571 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:50Z","lastTransitionTime":"2025-11-29T14:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.918910 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.918986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.919004 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.919030 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:50 crc kubenswrapper[4907]: I1129 14:29:50.919049 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:50Z","lastTransitionTime":"2025-11-29T14:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.021932 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.022023 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.022041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.022101 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.022121 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.126115 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.126175 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.126198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.126224 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.126244 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.228930 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.229010 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.229039 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.229072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.229097 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.331827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.331884 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.331902 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.331929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.331946 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.434590 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.434650 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.434668 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.434692 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.434709 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.479140 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.479212 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.479202 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.479148 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:51 crc kubenswrapper[4907]: E1129 14:29:51.479350 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:51 crc kubenswrapper[4907]: E1129 14:29:51.479576 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:51 crc kubenswrapper[4907]: E1129 14:29:51.479745 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:51 crc kubenswrapper[4907]: E1129 14:29:51.479947 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.537827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.537882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.537900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.537925 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.537944 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.641038 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.641106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.641123 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.641149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.641167 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.744473 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.744539 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.744564 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.744594 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.744617 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.847351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.847428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.847501 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.847533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.847575 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.950473 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.950544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.950564 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.950589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:51 crc kubenswrapper[4907]: I1129 14:29:51.950606 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:51Z","lastTransitionTime":"2025-11-29T14:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.053620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.053685 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.053704 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.053731 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.053751 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.157658 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.157719 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.157738 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.157766 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.157784 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.260853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.260939 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.260962 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.260995 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.261022 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.364063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.364164 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.364224 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.364255 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.364273 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.467081 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.467219 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.467249 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.467281 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.467304 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.561728 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podStartSLOduration=72.561700703 podStartE2EDuration="1m12.561700703s" podCreationTimestamp="2025-11-29 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.536082572 +0000 UTC m=+90.522920304" watchObservedRunningTime="2025-11-29 14:29:52.561700703 +0000 UTC m=+90.548538395" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.562095 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pngnb" podStartSLOduration=71.562084104 podStartE2EDuration="1m11.562084104s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.561662682 +0000 UTC m=+90.548500364" watchObservedRunningTime="2025-11-29 14:29:52.562084104 +0000 UTC m=+90.548921796" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.579775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.579844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.579867 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.579897 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.579920 4907 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.601776 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vg6gc" podStartSLOduration=72.601674593 podStartE2EDuration="1m12.601674593s" podCreationTimestamp="2025-11-29 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.578353978 +0000 UTC m=+90.565191670" watchObservedRunningTime="2025-11-29 14:29:52.601674593 +0000 UTC m=+90.588512275" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.629338 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2hghh" podStartSLOduration=71.629310802 podStartE2EDuration="1m11.629310802s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.601065876 +0000 UTC m=+90.587903628" watchObservedRunningTime="2025-11-29 14:29:52.629310802 +0000 UTC m=+90.616148484" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.682743 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.682789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.682800 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.682819 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.682833 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.685296 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.685270389 podStartE2EDuration="1m11.685270389s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.685015851 +0000 UTC m=+90.671853513" watchObservedRunningTime="2025-11-29 14:29:52.685270389 +0000 UTC m=+90.672108081" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.685921 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c92rh" podStartSLOduration=72.685904227 podStartE2EDuration="1m12.685904227s" podCreationTimestamp="2025-11-29 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.667226344 +0000 UTC m=+90.654064036" watchObservedRunningTime="2025-11-29 14:29:52.685904227 +0000 UTC m=+90.672741929" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.768392 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.76836915 podStartE2EDuration="1m11.76836915s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.768355189 +0000 UTC m=+90.755192851" watchObservedRunningTime="2025-11-29 14:29:52.76836915 +0000 UTC m=+90.755206812" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.785323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.785379 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.785400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.785424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.785478 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.869773 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d5zvb" podStartSLOduration=71.869756243 podStartE2EDuration="1m11.869756243s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.858891663 +0000 UTC m=+90.845729315" watchObservedRunningTime="2025-11-29 14:29:52.869756243 +0000 UTC m=+90.856593895" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.870349 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.870345139 podStartE2EDuration="38.870345139s" podCreationTimestamp="2025-11-29 14:29:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.869872896 +0000 UTC m=+90.856710548" watchObservedRunningTime="2025-11-29 14:29:52.870345139 +0000 UTC m=+90.857182791" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.880907 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.88088848 podStartE2EDuration="23.88088848s" podCreationTimestamp="2025-11-29 14:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.879648685 +0000 UTC m=+90.866486337" watchObservedRunningTime="2025-11-29 14:29:52.88088848 +0000 UTC m=+90.867726132" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.887647 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 
14:29:52.887713 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.887732 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.887762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.887781 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.990542 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.990614 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.990637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.990673 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:52 crc kubenswrapper[4907]: I1129 14:29:52.990716 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:52Z","lastTransitionTime":"2025-11-29T14:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.093156 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.093225 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.093260 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.093287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.093305 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:53Z","lastTransitionTime":"2025-11-29T14:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.197299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.197375 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.197395 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.197425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.197484 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:53Z","lastTransitionTime":"2025-11-29T14:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.300841 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.300900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.300919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.300942 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.300959 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:53Z","lastTransitionTime":"2025-11-29T14:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.404768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.404853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.404871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.404903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.404921 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:53Z","lastTransitionTime":"2025-11-29T14:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.479327 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.479399 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.479539 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:53 crc kubenswrapper[4907]: E1129 14:29:53.479732 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.479779 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:53 crc kubenswrapper[4907]: E1129 14:29:53.480085 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:53 crc kubenswrapper[4907]: E1129 14:29:53.479980 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:53 crc kubenswrapper[4907]: E1129 14:29:53.480347 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.507794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.507851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.507870 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.507952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.507974 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:53Z","lastTransitionTime":"2025-11-29T14:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.611074 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.611957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.611973 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.612004 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.612019 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:53Z","lastTransitionTime":"2025-11-29T14:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.715314 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.715374 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.715392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.715417 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.715471 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:53Z","lastTransitionTime":"2025-11-29T14:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.818051 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.818112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.818130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.818154 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.818171 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:53Z","lastTransitionTime":"2025-11-29T14:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.921247 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.921293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.921310 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.921335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:53 crc kubenswrapper[4907]: I1129 14:29:53.921355 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:53Z","lastTransitionTime":"2025-11-29T14:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.023799 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.023859 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.023880 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.023907 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.023925 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.127107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.127169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.127186 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.127213 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.127230 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.230282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.230339 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.230361 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.230386 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.230404 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.332934 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.332984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.333003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.333024 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.333039 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.437248 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.437335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.437408 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.437471 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.437606 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.480535 4907 scope.go:117] "RemoveContainer" containerID="34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92" Nov 29 14:29:54 crc kubenswrapper[4907]: E1129 14:29:54.480794 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.539884 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.539925 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.539936 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.539951 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.539961 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.642846 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.642903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.642917 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.642938 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.642953 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.745529 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.745597 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.745626 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.745722 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.745809 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.848418 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.848531 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.848557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.848592 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.848662 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.951680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.951739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.951757 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.951788 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:54 crc kubenswrapper[4907]: I1129 14:29:54.951804 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:54Z","lastTransitionTime":"2025-11-29T14:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.055390 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.055496 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.055518 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.055543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.055560 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.158842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.158928 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.158952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.158985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.159045 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.262496 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.262553 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.262570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.262596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.262613 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.366021 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.366148 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.366166 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.366188 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.366205 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.468985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.469063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.469086 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.469116 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.469138 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.480032 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.480116 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.480213 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:55 crc kubenswrapper[4907]: E1129 14:29:55.480299 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:55 crc kubenswrapper[4907]: E1129 14:29:55.480490 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:55 crc kubenswrapper[4907]: E1129 14:29:55.480585 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.481299 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:55 crc kubenswrapper[4907]: E1129 14:29:55.481572 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.572160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.572285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.572308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.572332 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.572350 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.675092 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.675141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.675163 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.675252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.675278 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.779066 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.779148 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.779169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.779193 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.779212 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.882164 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.882253 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.882275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.882296 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.882345 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.986001 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.986051 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.986068 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.986091 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:55 crc kubenswrapper[4907]: I1129 14:29:55.986108 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:55Z","lastTransitionTime":"2025-11-29T14:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.088634 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.088713 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.088731 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.088754 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.088772 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:56Z","lastTransitionTime":"2025-11-29T14:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.192074 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.192145 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.192165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.192189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.192206 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:56Z","lastTransitionTime":"2025-11-29T14:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.295309 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.295377 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.295395 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.295420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.295466 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:56Z","lastTransitionTime":"2025-11-29T14:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.398976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.399052 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.399076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.399107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.399130 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:56Z","lastTransitionTime":"2025-11-29T14:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.501734 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.501776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.501792 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.501813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.501829 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:56Z","lastTransitionTime":"2025-11-29T14:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.601258 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.601322 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.601339 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.601365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.601383 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T14:29:56Z","lastTransitionTime":"2025-11-29T14:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.664506 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=72.664482276 podStartE2EDuration="1m12.664482276s" podCreationTimestamp="2025-11-29 14:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:52.908068646 +0000 UTC m=+90.894906288" watchObservedRunningTime="2025-11-29 14:29:56.664482276 +0000 UTC m=+94.651319968" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.665575 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg"] Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.666050 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.670533 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.671412 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.671485 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.671513 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.783649 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/a1b97929-5ee1-4e89-86be-c76fb345c790-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.783713 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1b97929-5ee1-4e89-86be-c76fb345c790-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.783763 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b97929-5ee1-4e89-86be-c76fb345c790-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.783812 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b97929-5ee1-4e89-86be-c76fb345c790-service-ca\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.784044 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a1b97929-5ee1-4e89-86be-c76fb345c790-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.885786 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b97929-5ee1-4e89-86be-c76fb345c790-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.885858 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b97929-5ee1-4e89-86be-c76fb345c790-service-ca\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.885959 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a1b97929-5ee1-4e89-86be-c76fb345c790-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.886000 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1b97929-5ee1-4e89-86be-c76fb345c790-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.886086 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/a1b97929-5ee1-4e89-86be-c76fb345c790-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.886119 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a1b97929-5ee1-4e89-86be-c76fb345c790-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.886329 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a1b97929-5ee1-4e89-86be-c76fb345c790-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.887075 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b97929-5ee1-4e89-86be-c76fb345c790-service-ca\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.894511 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b97929-5ee1-4e89-86be-c76fb345c790-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 
14:29:56.916707 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1b97929-5ee1-4e89-86be-c76fb345c790-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-smjgg\" (UID: \"a1b97929-5ee1-4e89-86be-c76fb345c790\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:56 crc kubenswrapper[4907]: I1129 14:29:56.990869 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" Nov 29 14:29:57 crc kubenswrapper[4907]: W1129 14:29:57.013363 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b97929_5ee1_4e89_86be_c76fb345c790.slice/crio-e520b0b42d8b31e5674f71c1813d9d5ca5a734755f14bfa7163c54a5901404d0 WatchSource:0}: Error finding container e520b0b42d8b31e5674f71c1813d9d5ca5a734755f14bfa7163c54a5901404d0: Status 404 returned error can't find the container with id e520b0b42d8b31e5674f71c1813d9d5ca5a734755f14bfa7163c54a5901404d0 Nov 29 14:29:57 crc kubenswrapper[4907]: I1129 14:29:57.105138 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" event={"ID":"a1b97929-5ee1-4e89-86be-c76fb345c790","Type":"ContainerStarted","Data":"e520b0b42d8b31e5674f71c1813d9d5ca5a734755f14bfa7163c54a5901404d0"} Nov 29 14:29:57 crc kubenswrapper[4907]: I1129 14:29:57.478636 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:57 crc kubenswrapper[4907]: I1129 14:29:57.479048 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:57 crc kubenswrapper[4907]: I1129 14:29:57.479066 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:57 crc kubenswrapper[4907]: E1129 14:29:57.479222 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:57 crc kubenswrapper[4907]: I1129 14:29:57.479434 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:57 crc kubenswrapper[4907]: E1129 14:29:57.479675 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:57 crc kubenswrapper[4907]: E1129 14:29:57.479560 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:57 crc kubenswrapper[4907]: E1129 14:29:57.480015 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:58 crc kubenswrapper[4907]: I1129 14:29:58.109860 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" event={"ID":"a1b97929-5ee1-4e89-86be-c76fb345c790","Type":"ContainerStarted","Data":"4c99e0ec9c132b568e4d6298008c5f9b2c3a342b2d53e75fb99c8d614a507cc7"} Nov 29 14:29:58 crc kubenswrapper[4907]: I1129 14:29:58.124737 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-smjgg" podStartSLOduration=77.124719131 podStartE2EDuration="1m17.124719131s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:29:58.124548537 +0000 UTC m=+96.111386199" watchObservedRunningTime="2025-11-29 14:29:58.124719131 +0000 UTC m=+96.111556793" Nov 29 14:29:59 crc kubenswrapper[4907]: I1129 14:29:59.478787 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:29:59 crc kubenswrapper[4907]: I1129 14:29:59.478867 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:29:59 crc kubenswrapper[4907]: I1129 14:29:59.478801 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:29:59 crc kubenswrapper[4907]: I1129 14:29:59.478801 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:59 crc kubenswrapper[4907]: E1129 14:29:59.479020 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:29:59 crc kubenswrapper[4907]: E1129 14:29:59.479137 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:29:59 crc kubenswrapper[4907]: E1129 14:29:59.479301 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:29:59 crc kubenswrapper[4907]: E1129 14:29:59.479401 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:29:59 crc kubenswrapper[4907]: I1129 14:29:59.716115 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:29:59 crc kubenswrapper[4907]: E1129 14:29:59.716412 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:29:59 crc kubenswrapper[4907]: E1129 14:29:59.716558 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs podName:9f50e55a-d427-4cde-a639-d6c7597e937a nodeName:}" failed. No retries permitted until 2025-11-29 14:31:03.716530349 +0000 UTC m=+161.703368031 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs") pod "network-metrics-daemon-25ct5" (UID: "9f50e55a-d427-4cde-a639-d6c7597e937a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 14:30:01 crc kubenswrapper[4907]: I1129 14:30:01.478722 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:01 crc kubenswrapper[4907]: E1129 14:30:01.479162 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:01 crc kubenswrapper[4907]: I1129 14:30:01.478901 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:01 crc kubenswrapper[4907]: I1129 14:30:01.478810 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:01 crc kubenswrapper[4907]: E1129 14:30:01.479266 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:01 crc kubenswrapper[4907]: I1129 14:30:01.478968 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:01 crc kubenswrapper[4907]: E1129 14:30:01.479507 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:01 crc kubenswrapper[4907]: E1129 14:30:01.479729 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:03 crc kubenswrapper[4907]: I1129 14:30:03.479398 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:03 crc kubenswrapper[4907]: I1129 14:30:03.479523 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:03 crc kubenswrapper[4907]: I1129 14:30:03.479602 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:03 crc kubenswrapper[4907]: E1129 14:30:03.479609 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:03 crc kubenswrapper[4907]: I1129 14:30:03.479623 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:03 crc kubenswrapper[4907]: E1129 14:30:03.479754 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:03 crc kubenswrapper[4907]: E1129 14:30:03.479856 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:03 crc kubenswrapper[4907]: E1129 14:30:03.480263 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:05 crc kubenswrapper[4907]: I1129 14:30:05.479049 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:05 crc kubenswrapper[4907]: I1129 14:30:05.479100 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:05 crc kubenswrapper[4907]: I1129 14:30:05.480306 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:05 crc kubenswrapper[4907]: I1129 14:30:05.480719 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:05 crc kubenswrapper[4907]: E1129 14:30:05.480805 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:05 crc kubenswrapper[4907]: E1129 14:30:05.480894 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:05 crc kubenswrapper[4907]: E1129 14:30:05.480997 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:05 crc kubenswrapper[4907]: E1129 14:30:05.481823 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:05 crc kubenswrapper[4907]: I1129 14:30:05.482828 4907 scope.go:117] "RemoveContainer" containerID="34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92" Nov 29 14:30:05 crc kubenswrapper[4907]: E1129 14:30:05.483078 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" Nov 29 14:30:07 crc kubenswrapper[4907]: I1129 14:30:07.479274 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:07 crc kubenswrapper[4907]: E1129 14:30:07.479486 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:07 crc kubenswrapper[4907]: I1129 14:30:07.479641 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:07 crc kubenswrapper[4907]: E1129 14:30:07.479812 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:07 crc kubenswrapper[4907]: I1129 14:30:07.480061 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:07 crc kubenswrapper[4907]: E1129 14:30:07.480185 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:07 crc kubenswrapper[4907]: I1129 14:30:07.480972 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:07 crc kubenswrapper[4907]: E1129 14:30:07.481356 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:09 crc kubenswrapper[4907]: I1129 14:30:09.478478 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:09 crc kubenswrapper[4907]: I1129 14:30:09.478478 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:09 crc kubenswrapper[4907]: I1129 14:30:09.478558 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:09 crc kubenswrapper[4907]: I1129 14:30:09.478637 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:09 crc kubenswrapper[4907]: E1129 14:30:09.478827 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:09 crc kubenswrapper[4907]: E1129 14:30:09.479185 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:09 crc kubenswrapper[4907]: E1129 14:30:09.479401 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:09 crc kubenswrapper[4907]: E1129 14:30:09.479706 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:11 crc kubenswrapper[4907]: I1129 14:30:11.479315 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:11 crc kubenswrapper[4907]: I1129 14:30:11.479397 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:11 crc kubenswrapper[4907]: E1129 14:30:11.479487 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:11 crc kubenswrapper[4907]: I1129 14:30:11.479347 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:11 crc kubenswrapper[4907]: I1129 14:30:11.479421 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:11 crc kubenswrapper[4907]: E1129 14:30:11.479899 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:11 crc kubenswrapper[4907]: E1129 14:30:11.479930 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:11 crc kubenswrapper[4907]: E1129 14:30:11.480056 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:13 crc kubenswrapper[4907]: I1129 14:30:13.478946 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:13 crc kubenswrapper[4907]: I1129 14:30:13.479050 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:13 crc kubenswrapper[4907]: I1129 14:30:13.479060 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:13 crc kubenswrapper[4907]: I1129 14:30:13.478977 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:13 crc kubenswrapper[4907]: E1129 14:30:13.479187 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:13 crc kubenswrapper[4907]: E1129 14:30:13.479329 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:13 crc kubenswrapper[4907]: E1129 14:30:13.479466 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:13 crc kubenswrapper[4907]: E1129 14:30:13.479623 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.179264 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/1.log" Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.180061 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/0.log" Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.180140 4907 generic.go:334] "Generic (PLEG): container finished" podID="3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4" containerID="6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8" exitCode=1 Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.180186 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5zvb" 
event={"ID":"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4","Type":"ContainerDied","Data":"6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8"} Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.180237 4907 scope.go:117] "RemoveContainer" containerID="bf35d0d723fdc10d0300b49c057facc9a0068f4a0bb08dfcffe4fcf26c485717" Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.181871 4907 scope.go:117] "RemoveContainer" containerID="6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8" Nov 29 14:30:15 crc kubenswrapper[4907]: E1129 14:30:15.185359 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-d5zvb_openshift-multus(3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4)\"" pod="openshift-multus/multus-d5zvb" podUID="3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4" Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.478854 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.478862 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:15 crc kubenswrapper[4907]: E1129 14:30:15.479759 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.478893 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:15 crc kubenswrapper[4907]: E1129 14:30:15.479858 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:15 crc kubenswrapper[4907]: I1129 14:30:15.478869 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:15 crc kubenswrapper[4907]: E1129 14:30:15.479958 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:15 crc kubenswrapper[4907]: E1129 14:30:15.480579 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:16 crc kubenswrapper[4907]: I1129 14:30:16.187157 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/1.log" Nov 29 14:30:16 crc kubenswrapper[4907]: I1129 14:30:16.480200 4907 scope.go:117] "RemoveContainer" containerID="34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92" Nov 29 14:30:16 crc kubenswrapper[4907]: E1129 14:30:16.480399 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-dtnl8_openshift-ovn-kubernetes(e5339013-9544-4e7e-a449-c257f1086638)\"" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" Nov 29 14:30:17 crc kubenswrapper[4907]: I1129 14:30:17.479590 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:17 crc kubenswrapper[4907]: I1129 14:30:17.479688 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:17 crc kubenswrapper[4907]: I1129 14:30:17.479751 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:17 crc kubenswrapper[4907]: I1129 14:30:17.479626 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:17 crc kubenswrapper[4907]: E1129 14:30:17.479798 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:17 crc kubenswrapper[4907]: E1129 14:30:17.479957 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:17 crc kubenswrapper[4907]: E1129 14:30:17.480042 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:17 crc kubenswrapper[4907]: E1129 14:30:17.480161 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:19 crc kubenswrapper[4907]: I1129 14:30:19.478605 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:19 crc kubenswrapper[4907]: I1129 14:30:19.478656 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:19 crc kubenswrapper[4907]: I1129 14:30:19.478625 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:19 crc kubenswrapper[4907]: I1129 14:30:19.478804 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:19 crc kubenswrapper[4907]: E1129 14:30:19.478860 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:19 crc kubenswrapper[4907]: E1129 14:30:19.479041 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:19 crc kubenswrapper[4907]: E1129 14:30:19.479179 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:19 crc kubenswrapper[4907]: E1129 14:30:19.479353 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:21 crc kubenswrapper[4907]: I1129 14:30:21.478647 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:21 crc kubenswrapper[4907]: I1129 14:30:21.478672 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:21 crc kubenswrapper[4907]: I1129 14:30:21.478724 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:21 crc kubenswrapper[4907]: I1129 14:30:21.478734 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:21 crc kubenswrapper[4907]: E1129 14:30:21.478911 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:21 crc kubenswrapper[4907]: E1129 14:30:21.479074 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:21 crc kubenswrapper[4907]: E1129 14:30:21.479359 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:21 crc kubenswrapper[4907]: E1129 14:30:21.479542 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:22 crc kubenswrapper[4907]: E1129 14:30:22.422764 4907 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 29 14:30:22 crc kubenswrapper[4907]: E1129 14:30:22.583949 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 14:30:23 crc kubenswrapper[4907]: I1129 14:30:23.479125 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:23 crc kubenswrapper[4907]: I1129 14:30:23.479159 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:23 crc kubenswrapper[4907]: I1129 14:30:23.479161 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:23 crc kubenswrapper[4907]: E1129 14:30:23.479319 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:23 crc kubenswrapper[4907]: I1129 14:30:23.479390 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:23 crc kubenswrapper[4907]: E1129 14:30:23.479593 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:23 crc kubenswrapper[4907]: E1129 14:30:23.479824 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:23 crc kubenswrapper[4907]: E1129 14:30:23.479960 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:25 crc kubenswrapper[4907]: I1129 14:30:25.478692 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:25 crc kubenswrapper[4907]: I1129 14:30:25.478791 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:25 crc kubenswrapper[4907]: E1129 14:30:25.478879 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:25 crc kubenswrapper[4907]: E1129 14:30:25.479048 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:25 crc kubenswrapper[4907]: I1129 14:30:25.478831 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:25 crc kubenswrapper[4907]: I1129 14:30:25.479655 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:25 crc kubenswrapper[4907]: E1129 14:30:25.479830 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:25 crc kubenswrapper[4907]: E1129 14:30:25.480030 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:27 crc kubenswrapper[4907]: I1129 14:30:27.479019 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:27 crc kubenswrapper[4907]: I1129 14:30:27.479071 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:27 crc kubenswrapper[4907]: I1129 14:30:27.479130 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:27 crc kubenswrapper[4907]: I1129 14:30:27.479228 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:27 crc kubenswrapper[4907]: E1129 14:30:27.479243 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:27 crc kubenswrapper[4907]: E1129 14:30:27.479388 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:27 crc kubenswrapper[4907]: E1129 14:30:27.479554 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:27 crc kubenswrapper[4907]: E1129 14:30:27.479674 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:27 crc kubenswrapper[4907]: E1129 14:30:27.585358 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 14:30:29 crc kubenswrapper[4907]: I1129 14:30:29.478784 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:29 crc kubenswrapper[4907]: I1129 14:30:29.478856 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:29 crc kubenswrapper[4907]: I1129 14:30:29.478916 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:29 crc kubenswrapper[4907]: E1129 14:30:29.478945 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:29 crc kubenswrapper[4907]: E1129 14:30:29.479109 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:29 crc kubenswrapper[4907]: E1129 14:30:29.479269 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:29 crc kubenswrapper[4907]: I1129 14:30:29.479361 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:29 crc kubenswrapper[4907]: I1129 14:30:29.479560 4907 scope.go:117] "RemoveContainer" containerID="6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8" Nov 29 14:30:29 crc kubenswrapper[4907]: E1129 14:30:29.479560 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:30 crc kubenswrapper[4907]: I1129 14:30:30.244357 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/1.log" Nov 29 14:30:30 crc kubenswrapper[4907]: I1129 14:30:30.244488 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5zvb" event={"ID":"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4","Type":"ContainerStarted","Data":"758c2a8240a7ddc01c0eefe154215e74709991c70756567f3ce4c50d9d63ef7f"} Nov 29 14:30:30 crc kubenswrapper[4907]: I1129 14:30:30.481947 4907 scope.go:117] "RemoveContainer" containerID="34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92" Nov 29 14:30:31 crc kubenswrapper[4907]: I1129 14:30:31.252649 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/3.log" Nov 29 14:30:31 crc kubenswrapper[4907]: I1129 14:30:31.256716 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerStarted","Data":"855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d"} Nov 29 14:30:31 crc kubenswrapper[4907]: I1129 14:30:31.257222 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:30:31 crc kubenswrapper[4907]: I1129 14:30:31.295381 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podStartSLOduration=110.295351773 podStartE2EDuration="1m50.295351773s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:31.292894743 +0000 UTC m=+129.279732435" watchObservedRunningTime="2025-11-29 14:30:31.295351773 +0000 UTC m=+129.282189465" Nov 29 14:30:31 crc kubenswrapper[4907]: I1129 14:30:31.479551 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:31 crc kubenswrapper[4907]: I1129 14:30:31.479625 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:31 crc kubenswrapper[4907]: I1129 14:30:31.479571 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:31 crc kubenswrapper[4907]: I1129 14:30:31.479556 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:31 crc kubenswrapper[4907]: E1129 14:30:31.479747 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:31 crc kubenswrapper[4907]: E1129 14:30:31.479896 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:31 crc kubenswrapper[4907]: E1129 14:30:31.480029 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:31 crc kubenswrapper[4907]: E1129 14:30:31.480185 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:31 crc kubenswrapper[4907]: I1129 14:30:31.507273 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-25ct5"] Nov 29 14:30:32 crc kubenswrapper[4907]: I1129 14:30:32.259617 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:32 crc kubenswrapper[4907]: E1129 14:30:32.259779 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:32 crc kubenswrapper[4907]: E1129 14:30:32.586201 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 14:30:33 crc kubenswrapper[4907]: I1129 14:30:33.479352 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:33 crc kubenswrapper[4907]: I1129 14:30:33.479505 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:33 crc kubenswrapper[4907]: E1129 14:30:33.479568 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:33 crc kubenswrapper[4907]: I1129 14:30:33.479596 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:33 crc kubenswrapper[4907]: I1129 14:30:33.479657 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:33 crc kubenswrapper[4907]: E1129 14:30:33.479857 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:33 crc kubenswrapper[4907]: E1129 14:30:33.479941 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:33 crc kubenswrapper[4907]: E1129 14:30:33.480050 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:35 crc kubenswrapper[4907]: I1129 14:30:35.478936 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:35 crc kubenswrapper[4907]: E1129 14:30:35.479782 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:35 crc kubenswrapper[4907]: I1129 14:30:35.479827 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:35 crc kubenswrapper[4907]: I1129 14:30:35.479931 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:35 crc kubenswrapper[4907]: E1129 14:30:35.480023 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:35 crc kubenswrapper[4907]: I1129 14:30:35.480076 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:35 crc kubenswrapper[4907]: E1129 14:30:35.480104 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:35 crc kubenswrapper[4907]: E1129 14:30:35.480253 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:37 crc kubenswrapper[4907]: I1129 14:30:37.479359 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:37 crc kubenswrapper[4907]: I1129 14:30:37.479481 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:37 crc kubenswrapper[4907]: E1129 14:30:37.479541 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 14:30:37 crc kubenswrapper[4907]: I1129 14:30:37.479668 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:37 crc kubenswrapper[4907]: I1129 14:30:37.479758 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:37 crc kubenswrapper[4907]: E1129 14:30:37.479661 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25ct5" podUID="9f50e55a-d427-4cde-a639-d6c7597e937a" Nov 29 14:30:37 crc kubenswrapper[4907]: E1129 14:30:37.479831 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 14:30:37 crc kubenswrapper[4907]: E1129 14:30:37.479967 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 14:30:38 crc kubenswrapper[4907]: I1129 14:30:38.427125 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.479382 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.479507 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.479532 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.479539 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.482206 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.482504 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.483157 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.484706 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.485942 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 29 14:30:39 crc kubenswrapper[4907]: I1129 14:30:39.485986 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.378098 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.428624 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mtvpt"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.429367 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.437380 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451130 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jcdhm"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451614 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-image-import-ca\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451702 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28aad01a-534b-4d04-aceb-ad163db9871c-serving-cert\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451732 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-audit\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451753 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28aad01a-534b-4d04-aceb-ad163db9871c-etcd-client\") pod \"apiserver-76f77b778f-mtvpt\" (UID: 
\"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451782 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hth\" (UniqueName: \"kubernetes.io/projected/28aad01a-534b-4d04-aceb-ad163db9871c-kube-api-access-85hth\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451803 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451850 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-config\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451889 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/28aad01a-534b-4d04-aceb-ad163db9871c-encryption-config\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451944 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/28aad01a-534b-4d04-aceb-ad163db9871c-node-pullsecrets\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.451985 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-etcd-serving-ca\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.452006 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28aad01a-534b-4d04-aceb-ad163db9871c-audit-dir\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.452561 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.458793 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gpnkx"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.459538 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.460287 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.460423 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.460743 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.461111 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.461154 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.461263 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.461333 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.461568 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.461808 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.461911 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.462683 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.462759 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.462843 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.462922 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.471220 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.471458 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.471622 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.471657 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.471774 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.471994 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.472022 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.474187 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.474661 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.475013 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.475431 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.480519 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.481117 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.481630 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.482110 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.482852 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.482863 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.483928 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.484252 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.484813 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.485079 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.485290 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.485508 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.485555 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.488201 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.488458 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.489240 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.489284 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.489510 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.489698 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.489852 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.490306 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.490473 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.490609 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.491509 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-5t44g"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.492189 4907 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.493540 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c572b"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.494118 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.496301 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.500853 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-rthjj"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.501058 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.501318 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hhl5h"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.501599 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.501743 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.501864 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.501901 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.501929 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.501951 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.502723 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.502916 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.503073 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.503939 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-rthjj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.504712 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j46xp"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.505060 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.505152 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.505465 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.505894 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.506027 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.510346 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.510816 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qdmt6"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.511296 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.510824 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.515772 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.510975 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.518536 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.519739 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.519822 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.520041 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.521874 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.528147 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.528639 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.534799 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.538887 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.551605 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 29 14:30:47 crc 
kubenswrapper[4907]: I1129 14:30:47.552181 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.553318 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.553497 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.553864 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.554064 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.554160 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.554254 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.554363 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.554499 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.554611 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.554781 4907 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.554989 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.555067 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.555204 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.555765 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.556065 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mgntg"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.556365 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-audit\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.556710 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.556122 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.557750 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.557802 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pv2xm\" (UID: \"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558568 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73987028-5c99-42b5-b871-213b94cd4826-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558607 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558648 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccf1336b-0e9f-42d7-adca-45e017360773-config\") pod \"kube-controller-manager-operator-78b949d7b-djhlh\" (UID: \"ccf1336b-0e9f-42d7-adca-45e017360773\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558673 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-oauth-serving-cert\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558698 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/73987028-5c99-42b5-b871-213b94cd4826-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558721 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dvh9\" (UniqueName: \"kubernetes.io/projected/73987028-5c99-42b5-b871-213b94cd4826-kube-api-access-8dvh9\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558743 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558766 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad127416-8f9c-4d2d-a13a-8a0b69525847-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558792 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2mm6\" (UniqueName: \"kubernetes.io/projected/2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef-kube-api-access-d2mm6\") pod \"openshift-controller-manager-operator-756b6f6bc6-pv2xm\" (UID: \"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558817 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad127416-8f9c-4d2d-a13a-8a0b69525847-etcd-client\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad127416-8f9c-4d2d-a13a-8a0b69525847-encryption-config\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558878 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-config\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558899 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccf1336b-0e9f-42d7-adca-45e017360773-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-djhlh\" (UID: \"ccf1336b-0e9f-42d7-adca-45e017360773\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.558942 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.559020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec75ca64-71c8-4a23-8723-378e0450548b-config\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.559080 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad127416-8f9c-4d2d-a13a-8a0b69525847-serving-cert\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.559127 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvzfj\" (UniqueName: \"kubernetes.io/projected/72fe0643-91a8-459e-aec7-257e5b07ea41-kube-api-access-fvzfj\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.559161 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-trusted-ca-bundle\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.559182 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec75ca64-71c8-4a23-8723-378e0450548b-etcd-service-ca\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.559207 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: 
\"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.559260 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-etcd-serving-ca\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.559952 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-etcd-serving-ca\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.560009 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.560057 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad127416-8f9c-4d2d-a13a-8a0b69525847-audit-dir\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.560081 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-serving-cert\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.560101 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-oauth-config\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.560135 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72fe0643-91a8-459e-aec7-257e5b07ea41-serving-cert\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.560178 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-service-ca\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.560228 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28aad01a-534b-4d04-aceb-ad163db9871c-etcd-client\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.560275 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec75ca64-71c8-4a23-8723-378e0450548b-serving-cert\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.561710 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.561834 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-audit\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562184 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7f8\" (UniqueName: \"kubernetes.io/projected/0c70da9a-ff96-432f-81ad-382c70754e70-kube-api-access-bd7f8\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562221 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ec75ca64-71c8-4a23-8723-378e0450548b-etcd-ca\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562277 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-pv2xm\" (UID: \"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562298 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ec75ca64-71c8-4a23-8723-378e0450548b-etcd-client\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562335 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562371 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85hth\" (UniqueName: \"kubernetes.io/projected/28aad01a-534b-4d04-aceb-ad163db9871c-kube-api-access-85hth\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562407 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562481 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73987028-5c99-42b5-b871-213b94cd4826-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562555 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-client-ca\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562578 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccf1336b-0e9f-42d7-adca-45e017360773-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-djhlh\" (UID: \"ccf1336b-0e9f-42d7-adca-45e017360773\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562594 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562685 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-config\") pod 
\"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562726 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-console-config\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562756 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562783 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad127416-8f9c-4d2d-a13a-8a0b69525847-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.562808 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z22hh\" (UniqueName: \"kubernetes.io/projected/ec75ca64-71c8-4a23-8723-378e0450548b-kube-api-access-z22hh\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.563053 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-audit-policies\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.563100 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/28aad01a-534b-4d04-aceb-ad163db9871c-encryption-config\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.563723 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.563769 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-config\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.563851 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28aad01a-534b-4d04-aceb-ad163db9871c-node-pullsecrets\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.563881 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.563941 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66db1fbb-f050-4af3-977b-831602348a9b-audit-dir\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.563984 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/28aad01a-534b-4d04-aceb-ad163db9871c-node-pullsecrets\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.564010 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad127416-8f9c-4d2d-a13a-8a0b69525847-audit-policies\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.564070 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28aad01a-534b-4d04-aceb-ad163db9871c-audit-dir\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 
29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.564111 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28aad01a-534b-4d04-aceb-ad163db9871c-audit-dir\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.564114 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqxlj\" (UniqueName: \"kubernetes.io/projected/ad127416-8f9c-4d2d-a13a-8a0b69525847-kube-api-access-fqxlj\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.564149 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.564186 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.564242 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnl6v\" (UniqueName: 
\"kubernetes.io/projected/66db1fbb-f050-4af3-977b-831602348a9b-kube-api-access-tnl6v\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.564271 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-image-import-ca\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.565017 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28aad01a-534b-4d04-aceb-ad163db9871c-serving-cert\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.565148 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/28aad01a-534b-4d04-aceb-ad163db9871c-image-import-ca\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.571530 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56l8g"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.571978 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.572280 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.572284 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/28aad01a-534b-4d04-aceb-ad163db9871c-encryption-config\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.572382 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.572689 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.572854 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28aad01a-534b-4d04-aceb-ad163db9871c-etcd-client\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.573706 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.573721 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28aad01a-534b-4d04-aceb-ad163db9871c-serving-cert\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.573896 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.573970 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.574086 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.576972 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.577293 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.578497 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.578652 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.578733 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.578762 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.578874 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.578892 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 29 14:30:47 
crc kubenswrapper[4907]: I1129 14:30:47.578972 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.578994 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.579020 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.578691 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.579120 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.579162 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.579273 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.579593 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.580068 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.580130 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.580292 4907 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"trusted-ca-bundle" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.580784 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.580909 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.589621 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.590804 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.591059 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.593213 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.595729 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.597401 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.600051 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.601859 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.600482 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.601529 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.609212 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.610489 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.611489 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.612242 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.613041 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.615676 4907 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-ingress/router-default-5444994796-dwtxw"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.616209 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.628956 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jcdhm"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.637327 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dkwzk"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.637885 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.637994 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.638170 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtr2r"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.638295 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.638498 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.638535 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.640147 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.640661 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.640873 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.641021 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.642020 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.642770 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.644630 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.645407 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.645602 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.646066 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mtvpt"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.646183 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.646824 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jcqhw"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.647474 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.647771 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.648419 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.649301 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.649724 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.650879 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.651266 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.652154 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jrw7z"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.652951 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrw7z" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.654037 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.655112 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.656722 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hhl5h"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.657197 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56l8g"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.658482 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qdmt6"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.659384 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh"] Nov 29 14:30:47 crc 
kubenswrapper[4907]: I1129 14:30:47.662335 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.667827 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2mm6\" (UniqueName: \"kubernetes.io/projected/2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef-kube-api-access-d2mm6\") pod \"openshift-controller-manager-operator-756b6f6bc6-pv2xm\" (UID: \"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.667860 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8000091-319c-4799-8068-77dd150352f4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6mjqv\" (UID: \"e8000091-319c-4799-8068-77dd150352f4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.667882 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-cabundle\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.667901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad127416-8f9c-4d2d-a13a-8a0b69525847-etcd-client\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.667918 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad127416-8f9c-4d2d-a13a-8a0b69525847-encryption-config\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.667933 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-config\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668004 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccf1336b-0e9f-42d7-adca-45e017360773-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-djhlh\" (UID: \"ccf1336b-0e9f-42d7-adca-45e017360773\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668046 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668064 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5769473-e380-4d0e-bfe4-aab057473a62-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668081 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b815226-63ff-4e97-bdcd-4a48ee001b99-config\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668104 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4fg\" (UniqueName: \"kubernetes.io/projected/7a422846-a8dc-47ce-912d-6444fb22b575-kube-api-access-9k4fg\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668118 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-config\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668134 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db0351fd-226e-4fc2-b8f6-0132252c4e2e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668151 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4t25\" (UniqueName: \"kubernetes.io/projected/9b815226-63ff-4e97-bdcd-4a48ee001b99-kube-api-access-k4t25\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668169 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec75ca64-71c8-4a23-8723-378e0450548b-config\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668192 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad127416-8f9c-4d2d-a13a-8a0b69525847-serving-cert\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668207 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvzfj\" (UniqueName: \"kubernetes.io/projected/72fe0643-91a8-459e-aec7-257e5b07ea41-kube-api-access-fvzfj\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668223 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-trusted-ca-bundle\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 
29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668236 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec75ca64-71c8-4a23-8723-378e0450548b-etcd-service-ca\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668253 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668270 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5769473-e380-4d0e-bfe4-aab057473a62-config\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668288 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5addff88-f08e-4fea-bc20-fb09e5e7d504-config\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668306 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa61d693-3aab-4537-9075-3f99fde5cb8d-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-pm8bc\" (UID: \"fa61d693-3aab-4537-9075-3f99fde5cb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668320 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g6mg\" (UniqueName: \"kubernetes.io/projected/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-kube-api-access-2g6mg\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668336 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668355 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2mx2\" (UniqueName: \"kubernetes.io/projected/db0351fd-226e-4fc2-b8f6-0132252c4e2e-kube-api-access-p2mx2\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668371 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad127416-8f9c-4d2d-a13a-8a0b69525847-audit-dir\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668387 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-serving-cert\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668400 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-oauth-config\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668417 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72fe0643-91a8-459e-aec7-257e5b07ea41-serving-cert\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668433 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-service-ca\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668467 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec75ca64-71c8-4a23-8723-378e0450548b-serving-cert\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668486 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87211b95-db9a-4429-be07-91b57e6355c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjc6g\" (UID: \"87211b95-db9a-4429-be07-91b57e6355c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668501 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7f8\" (UniqueName: \"kubernetes.io/projected/0c70da9a-ff96-432f-81ad-382c70754e70-kube-api-access-bd7f8\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668515 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ec75ca64-71c8-4a23-8723-378e0450548b-etcd-ca\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668531 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47k75\" (UniqueName: \"kubernetes.io/projected/e8000091-319c-4799-8068-77dd150352f4-kube-api-access-47k75\") pod \"openshift-apiserver-operator-796bbdcf4f-6mjqv\" (UID: \"e8000091-319c-4799-8068-77dd150352f4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668552 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pv2xm\" (UID: 
\"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668566 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ec75ca64-71c8-4a23-8723-378e0450548b-etcd-client\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668583 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668599 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbq5\" (UniqueName: \"kubernetes.io/projected/fa61d693-3aab-4537-9075-3f99fde5cb8d-kube-api-access-fwbq5\") pod \"cluster-samples-operator-665b6dd947-pm8bc\" (UID: \"fa61d693-3aab-4537-9075-3f99fde5cb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668613 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87211b95-db9a-4429-be07-91b57e6355c3-config\") pod \"kube-apiserver-operator-766d6c64bb-vjc6g\" (UID: \"87211b95-db9a-4429-be07-91b57e6355c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668630 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73987028-5c99-42b5-b871-213b94cd4826-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668645 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87211b95-db9a-4429-be07-91b57e6355c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjc6g\" (UID: \"87211b95-db9a-4429-be07-91b57e6355c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668661 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b815226-63ff-4e97-bdcd-4a48ee001b99-serving-cert\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668677 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-client-ca\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668693 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccf1336b-0e9f-42d7-adca-45e017360773-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-djhlh\" (UID: 
\"ccf1336b-0e9f-42d7-adca-45e017360773\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668743 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668763 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7a422846-a8dc-47ce-912d-6444fb22b575-machine-approver-tls\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668778 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-client-ca\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668795 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-console-config\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668812 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668826 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9486f\" (UniqueName: \"kubernetes.io/projected/d5769473-e380-4d0e-bfe4-aab057473a62-kube-api-access-9486f\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668841 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed407316-6a6a-47dc-99eb-6be44d35b22f-metrics-tls\") pod \"dns-operator-744455d44c-j46xp\" (UID: \"ed407316-6a6a-47dc-99eb-6be44d35b22f\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668860 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a422846-a8dc-47ce-912d-6444fb22b575-config\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668884 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad127416-8f9c-4d2d-a13a-8a0b69525847-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668902 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z22hh\" (UniqueName: \"kubernetes.io/projected/ec75ca64-71c8-4a23-8723-378e0450548b-kube-api-access-z22hh\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668918 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-audit-policies\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668936 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzlrd\" (UniqueName: \"kubernetes.io/projected/ed407316-6a6a-47dc-99eb-6be44d35b22f-kube-api-access-nzlrd\") pod \"dns-operator-744455d44c-j46xp\" (UID: \"ed407316-6a6a-47dc-99eb-6be44d35b22f\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668952 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a422846-a8dc-47ce-912d-6444fb22b575-auth-proxy-config\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668968 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.668994 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669012 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db0351fd-226e-4fc2-b8f6-0132252c4e2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669028 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66db1fbb-f050-4af3-977b-831602348a9b-audit-dir\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669042 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-key\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669059 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-serving-cert\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669075 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad127416-8f9c-4d2d-a13a-8a0b69525847-audit-policies\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669092 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqxlj\" (UniqueName: \"kubernetes.io/projected/ad127416-8f9c-4d2d-a13a-8a0b69525847-kube-api-access-fqxlj\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669108 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669124 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669139 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5addff88-f08e-4fea-bc20-fb09e5e7d504-serving-cert\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669155 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnl6v\" (UniqueName: \"kubernetes.io/projected/66db1fbb-f050-4af3-977b-831602348a9b-kube-api-access-tnl6v\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669170 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d5769473-e380-4d0e-bfe4-aab057473a62-images\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669185 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8000091-319c-4799-8068-77dd150352f4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6mjqv\" (UID: \"e8000091-319c-4799-8068-77dd150352f4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669200 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhj8j\" (UniqueName: 
\"kubernetes.io/projected/fd9557ee-7350-479a-80d7-20ae8c6a229d-kube-api-access-xhj8j\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669223 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db0351fd-226e-4fc2-b8f6-0132252c4e2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669239 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r27x\" (UniqueName: \"kubernetes.io/projected/565df8ea-1d38-4c6d-98a6-d63a58c8df03-kube-api-access-9r27x\") pod \"downloads-7954f5f757-rthjj\" (UID: \"565df8ea-1d38-4c6d-98a6-d63a58c8df03\") " pod="openshift-console/downloads-7954f5f757-rthjj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669257 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pv2xm\" (UID: \"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73987028-5c99-42b5-b871-213b94cd4826-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc 
kubenswrapper[4907]: I1129 14:30:47.669290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669291 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec75ca64-71c8-4a23-8723-378e0450548b-etcd-service-ca\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669306 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5addff88-f08e-4fea-bc20-fb09e5e7d504-trusted-ca\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669387 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccf1336b-0e9f-42d7-adca-45e017360773-config\") pod \"kube-controller-manager-operator-78b949d7b-djhlh\" (UID: \"ccf1336b-0e9f-42d7-adca-45e017360773\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669422 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-oauth-serving-cert\") pod \"console-f9d7485db-5t44g\" (UID: 
\"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669472 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad127416-8f9c-4d2d-a13a-8a0b69525847-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669502 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/73987028-5c99-42b5-b871-213b94cd4826-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669534 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dvh9\" (UniqueName: \"kubernetes.io/projected/73987028-5c99-42b5-b871-213b94cd4826-kube-api-access-8dvh9\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669585 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.669626 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss97s\" (UniqueName: \"kubernetes.io/projected/5addff88-f08e-4fea-bc20-fb09e5e7d504-kube-api-access-ss97s\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.672099 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad127416-8f9c-4d2d-a13a-8a0b69525847-encryption-config\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.672951 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-oauth-serving-cert\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.673596 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad127416-8f9c-4d2d-a13a-8a0b69525847-audit-policies\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.673816 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73987028-5c99-42b5-b871-213b94cd4826-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: 
I1129 14:30:47.673857 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.674045 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-console-config\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.674889 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-trusted-ca-bundle\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.674974 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pv2xm\" (UID: \"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.674909 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-client-ca\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.675411 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/ec75ca64-71c8-4a23-8723-378e0450548b-etcd-ca\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.675563 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad127416-8f9c-4d2d-a13a-8a0b69525847-etcd-client\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.675862 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.675965 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66db1fbb-f050-4af3-977b-831602348a9b-audit-dir\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.676055 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccf1336b-0e9f-42d7-adca-45e017360773-config\") pod \"kube-controller-manager-operator-78b949d7b-djhlh\" (UID: \"ccf1336b-0e9f-42d7-adca-45e017360773\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.676072 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad127416-8f9c-4d2d-a13a-8a0b69525847-serving-cert\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.676097 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad127416-8f9c-4d2d-a13a-8a0b69525847-audit-dir\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.676391 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad127416-8f9c-4d2d-a13a-8a0b69525847-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.676651 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad127416-8f9c-4d2d-a13a-8a0b69525847-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.677217 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.677693 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.677901 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-audit-policies\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.678594 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.678813 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec75ca64-71c8-4a23-8723-378e0450548b-config\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.679105 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 
29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.679125 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.679296 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.679409 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-serving-cert\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.680034 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.680034 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-service-ca\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.680049 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/73987028-5c99-42b5-b871-213b94cd4826-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.680480 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-config\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.680212 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pv2xm\" (UID: \"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.680078 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gpnkx"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.681337 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.681852 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-oauth-config\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:47 crc 
kubenswrapper[4907]: I1129 14:30:47.681978 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j46xp"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.683055 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.683201 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mgntg"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.684537 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.685878 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jrw7z"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.686009 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ec75ca64-71c8-4a23-8723-378e0450548b-etcd-client\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.686252 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec75ca64-71c8-4a23-8723-378e0450548b-serving-cert\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.686336 
4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.686963 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.688638 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.693157 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.693196 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c572b"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.693218 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.697953 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/72fe0643-91a8-459e-aec7-257e5b07ea41-serving-cert\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.699545 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.699837 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.700892 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.701921 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.702932 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.703938 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.704966 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5t44g"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.706085 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtr2r"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.707132 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.708118 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.709143 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dkwzk"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.710141 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jcqhw"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.710752 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccf1336b-0e9f-42d7-adca-45e017360773-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-djhlh\" (UID: \"ccf1336b-0e9f-42d7-adca-45e017360773\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.711088 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.711098 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.712468 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.713742 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jxwmf"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.714660 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.715046 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wl27q"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.715942 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.716492 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.717548 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jxwmf"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.718657 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wl27q"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.719729 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rthjj"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.720751 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.721727 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2j64h"] Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.722295 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.730401 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.751302 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770122 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5769473-e380-4d0e-bfe4-aab057473a62-config\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770164 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5addff88-f08e-4fea-bc20-fb09e5e7d504-config\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770183 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa61d693-3aab-4537-9075-3f99fde5cb8d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pm8bc\" (UID: \"fa61d693-3aab-4537-9075-3f99fde5cb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770202 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6mg\" (UniqueName: 
\"kubernetes.io/projected/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-kube-api-access-2g6mg\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770220 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2mx2\" (UniqueName: \"kubernetes.io/projected/db0351fd-226e-4fc2-b8f6-0132252c4e2e-kube-api-access-p2mx2\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770240 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87211b95-db9a-4429-be07-91b57e6355c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjc6g\" (UID: \"87211b95-db9a-4429-be07-91b57e6355c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770255 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47k75\" (UniqueName: \"kubernetes.io/projected/e8000091-319c-4799-8068-77dd150352f4-kube-api-access-47k75\") pod \"openshift-apiserver-operator-796bbdcf4f-6mjqv\" (UID: \"e8000091-319c-4799-8068-77dd150352f4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770276 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbq5\" (UniqueName: \"kubernetes.io/projected/fa61d693-3aab-4537-9075-3f99fde5cb8d-kube-api-access-fwbq5\") pod \"cluster-samples-operator-665b6dd947-pm8bc\" (UID: \"fa61d693-3aab-4537-9075-3f99fde5cb8d\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770289 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87211b95-db9a-4429-be07-91b57e6355c3-config\") pod \"kube-apiserver-operator-766d6c64bb-vjc6g\" (UID: \"87211b95-db9a-4429-be07-91b57e6355c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770309 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87211b95-db9a-4429-be07-91b57e6355c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjc6g\" (UID: \"87211b95-db9a-4429-be07-91b57e6355c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770323 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b815226-63ff-4e97-bdcd-4a48ee001b99-serving-cert\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770345 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7a422846-a8dc-47ce-912d-6444fb22b575-machine-approver-tls\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770360 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-client-ca\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770379 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9486f\" (UniqueName: \"kubernetes.io/projected/d5769473-e380-4d0e-bfe4-aab057473a62-kube-api-access-9486f\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770393 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed407316-6a6a-47dc-99eb-6be44d35b22f-metrics-tls\") pod \"dns-operator-744455d44c-j46xp\" (UID: \"ed407316-6a6a-47dc-99eb-6be44d35b22f\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770408 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a422846-a8dc-47ce-912d-6444fb22b575-config\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770432 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzlrd\" (UniqueName: \"kubernetes.io/projected/ed407316-6a6a-47dc-99eb-6be44d35b22f-kube-api-access-nzlrd\") pod \"dns-operator-744455d44c-j46xp\" (UID: \"ed407316-6a6a-47dc-99eb-6be44d35b22f\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770475 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a422846-a8dc-47ce-912d-6444fb22b575-auth-proxy-config\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770491 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770508 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db0351fd-226e-4fc2-b8f6-0132252c4e2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770531 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-key\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770547 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-serving-cert\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc 
kubenswrapper[4907]: I1129 14:30:47.770570 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5addff88-f08e-4fea-bc20-fb09e5e7d504-serving-cert\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770585 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d5769473-e380-4d0e-bfe4-aab057473a62-images\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770600 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8000091-319c-4799-8068-77dd150352f4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6mjqv\" (UID: \"e8000091-319c-4799-8068-77dd150352f4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770615 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhj8j\" (UniqueName: \"kubernetes.io/projected/fd9557ee-7350-479a-80d7-20ae8c6a229d-kube-api-access-xhj8j\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770644 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db0351fd-226e-4fc2-b8f6-0132252c4e2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r27x\" (UniqueName: \"kubernetes.io/projected/565df8ea-1d38-4c6d-98a6-d63a58c8df03-kube-api-access-9r27x\") pod \"downloads-7954f5f757-rthjj\" (UID: \"565df8ea-1d38-4c6d-98a6-d63a58c8df03\") " pod="openshift-console/downloads-7954f5f757-rthjj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770682 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5addff88-f08e-4fea-bc20-fb09e5e7d504-trusted-ca\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770705 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss97s\" (UniqueName: \"kubernetes.io/projected/5addff88-f08e-4fea-bc20-fb09e5e7d504-kube-api-access-ss97s\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770726 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8000091-319c-4799-8068-77dd150352f4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6mjqv\" (UID: \"e8000091-319c-4799-8068-77dd150352f4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770741 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-cabundle\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770767 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5769473-e380-4d0e-bfe4-aab057473a62-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770783 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b815226-63ff-4e97-bdcd-4a48ee001b99-config\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770802 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k4fg\" (UniqueName: \"kubernetes.io/projected/7a422846-a8dc-47ce-912d-6444fb22b575-kube-api-access-9k4fg\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770817 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-config\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770834 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db0351fd-226e-4fc2-b8f6-0132252c4e2e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.770850 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4t25\" (UniqueName: \"kubernetes.io/projected/9b815226-63ff-4e97-bdcd-4a48ee001b99-kube-api-access-k4t25\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.771759 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5769473-e380-4d0e-bfe4-aab057473a62-config\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.772313 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5addff88-f08e-4fea-bc20-fb09e5e7d504-config\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.773592 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a422846-a8dc-47ce-912d-6444fb22b575-config\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc 
kubenswrapper[4907]: I1129 14:30:47.774275 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.774520 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5addff88-f08e-4fea-bc20-fb09e5e7d504-trusted-ca\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.774970 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8000091-319c-4799-8068-77dd150352f4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6mjqv\" (UID: \"e8000091-319c-4799-8068-77dd150352f4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.775388 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-client-ca\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.775506 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.775654 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/d5769473-e380-4d0e-bfe4-aab057473a62-images\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.775664 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87211b95-db9a-4429-be07-91b57e6355c3-config\") pod \"kube-apiserver-operator-766d6c64bb-vjc6g\" (UID: \"87211b95-db9a-4429-be07-91b57e6355c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.775863 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a422846-a8dc-47ce-912d-6444fb22b575-auth-proxy-config\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.776698 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-config\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.776813 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87211b95-db9a-4429-be07-91b57e6355c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjc6g\" (UID: \"87211b95-db9a-4429-be07-91b57e6355c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.777116 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed407316-6a6a-47dc-99eb-6be44d35b22f-metrics-tls\") pod \"dns-operator-744455d44c-j46xp\" (UID: \"ed407316-6a6a-47dc-99eb-6be44d35b22f\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.777118 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8000091-319c-4799-8068-77dd150352f4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6mjqv\" (UID: \"e8000091-319c-4799-8068-77dd150352f4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.777617 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5addff88-f08e-4fea-bc20-fb09e5e7d504-serving-cert\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.777895 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa61d693-3aab-4537-9075-3f99fde5cb8d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pm8bc\" (UID: \"fa61d693-3aab-4537-9075-3f99fde5cb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.779036 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5769473-e380-4d0e-bfe4-aab057473a62-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 
14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.780181 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7a422846-a8dc-47ce-912d-6444fb22b575-machine-approver-tls\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.780956 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-serving-cert\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.790801 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.810867 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.835471 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.851759 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.872202 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.910154 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hth\" (UniqueName: 
\"kubernetes.io/projected/28aad01a-534b-4d04-aceb-ad163db9871c-kube-api-access-85hth\") pod \"apiserver-76f77b778f-mtvpt\" (UID: \"28aad01a-534b-4d04-aceb-ad163db9871c\") " pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.911422 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.932916 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.951553 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.971778 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 29 14:30:47 crc kubenswrapper[4907]: I1129 14:30:47.992030 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.011195 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.031925 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.051301 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.076294 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.092276 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.112776 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.133260 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.152582 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.172275 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.192630 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.214713 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.232509 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.251838 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.275029 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.292033 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.311593 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.332799 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.354782 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.355282 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mtvpt"] Nov 29 14:30:48 crc kubenswrapper[4907]: W1129 14:30:48.368870 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28aad01a_534b_4d04_aceb_ad163db9871c.slice/crio-3b43d6b09748ca8284857daa7a3b6f8e997be705403d3f68e33c1dd2c0863f57 WatchSource:0}: Error finding container 3b43d6b09748ca8284857daa7a3b6f8e997be705403d3f68e33c1dd2c0863f57: Status 404 returned error can't find the container with id 3b43d6b09748ca8284857daa7a3b6f8e997be705403d3f68e33c1dd2c0863f57 Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.372348 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.392214 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.412412 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.432344 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.452638 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.472608 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.491560 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.511470 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.531190 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.551301 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.571235 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.591710 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.611204 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.631996 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.649959 4907 request.go:700] Waited for 1.011091494s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dmarketplace-operator-dockercfg-5nsgg&limit=500&resourceVersion=0 Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.651781 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.670866 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.691548 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.710609 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.739706 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.752965 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.771542 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 
14:30:48.773469 4907 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.773519 4907 secret.go:188] Couldn't get secret openshift-ingress-operator/metrics-tls: failed to sync secret cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.773553 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b815226-63ff-4e97-bdcd-4a48ee001b99-serving-cert podName:9b815226-63ff-4e97-bdcd-4a48ee001b99 nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.273524059 +0000 UTC m=+147.260361741 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9b815226-63ff-4e97-bdcd-4a48ee001b99-serving-cert") pod "service-ca-operator-777779d784-pkrnj" (UID: "9b815226-63ff-4e97-bdcd-4a48ee001b99") : failed to sync secret cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.773608 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db0351fd-226e-4fc2-b8f6-0132252c4e2e-metrics-tls podName:db0351fd-226e-4fc2-b8f6-0132252c4e2e nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.273579261 +0000 UTC m=+147.260416993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/db0351fd-226e-4fc2-b8f6-0132252c4e2e-metrics-tls") pod "ingress-operator-5b745b69d9-2v2jw" (UID: "db0351fd-226e-4fc2-b8f6-0132252c4e2e") : failed to sync secret cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.773770 4907 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.773861 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-key podName:fd9557ee-7350-479a-80d7-20ae8c6a229d nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.273834771 +0000 UTC m=+147.260672463 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-key") pod "service-ca-9c57cc56f-jcqhw" (UID: "fd9557ee-7350-479a-80d7-20ae8c6a229d") : failed to sync secret cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.774746 4907 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.774785 4907 configmap.go:193] Couldn't get configMap openshift-ingress-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.774831 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-cabundle podName:fd9557ee-7350-479a-80d7-20ae8c6a229d nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.274809861 +0000 UTC m=+147.261647543 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-cabundle") pod "service-ca-9c57cc56f-jcqhw" (UID: "fd9557ee-7350-479a-80d7-20ae8c6a229d") : failed to sync configmap cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.774865 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db0351fd-226e-4fc2-b8f6-0132252c4e2e-trusted-ca podName:db0351fd-226e-4fc2-b8f6-0132252c4e2e nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.274846022 +0000 UTC m=+147.261683704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/db0351fd-226e-4fc2-b8f6-0132252c4e2e-trusted-ca") pod "ingress-operator-5b745b69d9-2v2jw" (UID: "db0351fd-226e-4fc2-b8f6-0132252c4e2e") : failed to sync configmap cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.775947 4907 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: E1129 14:30:48.776025 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9b815226-63ff-4e97-bdcd-4a48ee001b99-config podName:9b815226-63ff-4e97-bdcd-4a48ee001b99 nodeName:}" failed. No retries permitted until 2025-11-29 14:30:49.276007439 +0000 UTC m=+147.262845131 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/9b815226-63ff-4e97-bdcd-4a48ee001b99-config") pod "service-ca-operator-777779d784-pkrnj" (UID: "9b815226-63ff-4e97-bdcd-4a48ee001b99") : failed to sync configmap cache: timed out waiting for the condition Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.792734 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.812243 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.838895 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.851763 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.872587 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.894486 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.912694 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.932548 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.951994 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" 
Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.972283 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 14:30:48 crc kubenswrapper[4907]: I1129 14:30:48.992151 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.013043 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.032130 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.052250 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.072508 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.092688 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.111977 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.132260 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.152472 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.171506 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.191787 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.211530 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.231787 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.251418 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.273237 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.292385 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.301181 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b815226-63ff-4e97-bdcd-4a48ee001b99-serving-cert\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.301290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db0351fd-226e-4fc2-b8f6-0132252c4e2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.301345 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-key\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.301464 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db0351fd-226e-4fc2-b8f6-0132252c4e2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.301547 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-cabundle\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.301597 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b815226-63ff-4e97-bdcd-4a48ee001b99-config\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.302880 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b815226-63ff-4e97-bdcd-4a48ee001b99-config\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.303477 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-cabundle\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.304052 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db0351fd-226e-4fc2-b8f6-0132252c4e2e-trusted-ca\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.314381 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b815226-63ff-4e97-bdcd-4a48ee001b99-serving-cert\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.314878 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db0351fd-226e-4fc2-b8f6-0132252c4e2e-metrics-tls\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.315348 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fd9557ee-7350-479a-80d7-20ae8c6a229d-signing-key\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.337718 4907 generic.go:334] "Generic (PLEG): container finished" podID="28aad01a-534b-4d04-aceb-ad163db9871c" containerID="de8d898a3ba6f2c7cde67407e8124bda08259bcd99ed0d79a0701dde4bb8d1c8" exitCode=0 Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.337785 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" event={"ID":"28aad01a-534b-4d04-aceb-ad163db9871c","Type":"ContainerDied","Data":"de8d898a3ba6f2c7cde67407e8124bda08259bcd99ed0d79a0701dde4bb8d1c8"} Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.337830 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" event={"ID":"28aad01a-534b-4d04-aceb-ad163db9871c","Type":"ContainerStarted","Data":"3b43d6b09748ca8284857daa7a3b6f8e997be705403d3f68e33c1dd2c0863f57"} Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.366743 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2mm6\" (UniqueName: \"kubernetes.io/projected/2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef-kube-api-access-d2mm6\") pod \"openshift-controller-manager-operator-756b6f6bc6-pv2xm\" (UID: \"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.381835 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqxlj\" (UniqueName: \"kubernetes.io/projected/ad127416-8f9c-4d2d-a13a-8a0b69525847-kube-api-access-fqxlj\") pod \"apiserver-7bbb656c7d-4p48m\" (UID: \"ad127416-8f9c-4d2d-a13a-8a0b69525847\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.402619 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7f8\" 
(UniqueName: \"kubernetes.io/projected/0c70da9a-ff96-432f-81ad-382c70754e70-kube-api-access-bd7f8\") pod \"console-f9d7485db-5t44g\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.413855 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.418633 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvzfj\" (UniqueName: \"kubernetes.io/projected/72fe0643-91a8-459e-aec7-257e5b07ea41-kube-api-access-fvzfj\") pod \"route-controller-manager-6576b87f9c-q2sxr\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.428712 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.444862 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccf1336b-0e9f-42d7-adca-45e017360773-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-djhlh\" (UID: \"ccf1336b-0e9f-42d7-adca-45e017360773\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.460984 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z22hh\" (UniqueName: \"kubernetes.io/projected/ec75ca64-71c8-4a23-8723-378e0450548b-kube-api-access-z22hh\") pod \"etcd-operator-b45778765-c572b\" (UID: \"ec75ca64-71c8-4a23-8723-378e0450548b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.472767 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73987028-5c99-42b5-b871-213b94cd4826-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.495085 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dvh9\" (UniqueName: \"kubernetes.io/projected/73987028-5c99-42b5-b871-213b94cd4826-kube-api-access-8dvh9\") pod \"cluster-image-registry-operator-dc59b4c8b-lr54b\" (UID: \"73987028-5c99-42b5-b871-213b94cd4826\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.513415 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.514342 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnl6v\" (UniqueName: \"kubernetes.io/projected/66db1fbb-f050-4af3-977b-831602348a9b-kube-api-access-tnl6v\") pod \"oauth-openshift-558db77b4-qdmt6\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.516068 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.538114 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.542290 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.549652 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.552099 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.555929 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.571538 4907 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.591074 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.605092 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.605227 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.605271 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.605340 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.605369 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:49 crc kubenswrapper[4907]: E1129 14:30:49.606179 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:32:51.606137975 +0000 UTC m=+269.592975627 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.606789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.617100 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.617311 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.617370 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.622656 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.631537 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.650038 4907 request.go:700] Waited for 1.92752768s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0 Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.655730 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.671858 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.672969 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.699834 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.704363 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5t44g"] Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.705185 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m"] Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.712906 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4t25\" (UniqueName: \"kubernetes.io/projected/9b815226-63ff-4e97-bdcd-4a48ee001b99-kube-api-access-k4t25\") pod \"service-ca-operator-777779d784-pkrnj\" (UID: \"9b815226-63ff-4e97-bdcd-4a48ee001b99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.713124 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.721375 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.728916 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g6mg\" (UniqueName: \"kubernetes.io/projected/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-kube-api-access-2g6mg\") pod \"controller-manager-879f6c89f-gpnkx\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.738247 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.753863 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2mx2\" (UniqueName: \"kubernetes.io/projected/db0351fd-226e-4fc2-b8f6-0132252c4e2e-kube-api-access-p2mx2\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.770567 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9486f\" (UniqueName: \"kubernetes.io/projected/d5769473-e380-4d0e-bfe4-aab057473a62-kube-api-access-9486f\") pod \"machine-api-operator-5694c8668f-jcdhm\" (UID: \"d5769473-e380-4d0e-bfe4-aab057473a62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.773836 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b"] Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.793713 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbq5\" (UniqueName: \"kubernetes.io/projected/fa61d693-3aab-4537-9075-3f99fde5cb8d-kube-api-access-fwbq5\") pod \"cluster-samples-operator-665b6dd947-pm8bc\" (UID: \"fa61d693-3aab-4537-9075-3f99fde5cb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" Nov 29 14:30:49 crc kubenswrapper[4907]: W1129 14:30:49.812766 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73987028_5c99_42b5_b871_213b94cd4826.slice/crio-b21ad5cc8dcedc343a02ee5aba77346d88767d9a4df43283bee7b54819477576 WatchSource:0}: Error finding container b21ad5cc8dcedc343a02ee5aba77346d88767d9a4df43283bee7b54819477576: Status 
404 returned error can't find the container with id b21ad5cc8dcedc343a02ee5aba77346d88767d9a4df43283bee7b54819477576 Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.825278 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.826083 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47k75\" (UniqueName: \"kubernetes.io/projected/e8000091-319c-4799-8068-77dd150352f4-kube-api-access-47k75\") pod \"openshift-apiserver-operator-796bbdcf4f-6mjqv\" (UID: \"e8000091-319c-4799-8068-77dd150352f4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.840618 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzlrd\" (UniqueName: \"kubernetes.io/projected/ed407316-6a6a-47dc-99eb-6be44d35b22f-kube-api-access-nzlrd\") pod \"dns-operator-744455d44c-j46xp\" (UID: \"ed407316-6a6a-47dc-99eb-6be44d35b22f\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.849358 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhj8j\" (UniqueName: \"kubernetes.io/projected/fd9557ee-7350-479a-80d7-20ae8c6a229d-kube-api-access-xhj8j\") pod \"service-ca-9c57cc56f-jcqhw\" (UID: \"fd9557ee-7350-479a-80d7-20ae8c6a229d\") " pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.878824 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87211b95-db9a-4429-be07-91b57e6355c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjc6g\" (UID: \"87211b95-db9a-4429-be07-91b57e6355c3\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.887404 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss97s\" (UniqueName: \"kubernetes.io/projected/5addff88-f08e-4fea-bc20-fb09e5e7d504-kube-api-access-ss97s\") pod \"console-operator-58897d9998-hhl5h\" (UID: \"5addff88-f08e-4fea-bc20-fb09e5e7d504\") " pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.902399 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.910200 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k4fg\" (UniqueName: \"kubernetes.io/projected/7a422846-a8dc-47ce-912d-6444fb22b575-kube-api-access-9k4fg\") pod \"machine-approver-56656f9798-8jxfs\" (UID: \"7a422846-a8dc-47ce-912d-6444fb22b575\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.911697 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.916826 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh"] Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.928779 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db0351fd-226e-4fc2-b8f6-0132252c4e2e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2v2jw\" (UID: \"db0351fd-226e-4fc2-b8f6-0132252c4e2e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.939148 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.946280 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r27x\" (UniqueName: \"kubernetes.io/projected/565df8ea-1d38-4c6d-98a6-d63a58c8df03-kube-api-access-9r27x\") pod \"downloads-7954f5f757-rthjj\" (UID: \"565df8ea-1d38-4c6d-98a6-d63a58c8df03\") " pod="openshift-console/downloads-7954f5f757-rthjj" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.949191 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr"] Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.954595 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.976038 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" Nov 29 14:30:49 crc kubenswrapper[4907]: I1129 14:30:49.984396 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020344 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bd0fc3f-a365-48e4-bbe2-1509a739460a-serving-cert\") pod \"openshift-config-operator-7777fb866f-mvnwh\" (UID: \"2bd0fc3f-a365-48e4-bbe2-1509a739460a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020374 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45e1ce1-c477-4a87-a3ab-821e702ce490-service-ca-bundle\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020412 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a259660c-b57b-4a89-9f33-19d3bb3f5a93-secret-volume\") pod \"collect-profiles-29407110-d75hh\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020446 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8dba76-208a-4a46-8f25-50beb0d28ae6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kd4xm\" (UID: \"cf8dba76-208a-4a46-8f25-50beb0d28ae6\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020463 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d7892f-1c0b-4106-b34d-b309f9c60807-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45ssg\" (UID: \"c1d7892f-1c0b-4106-b34d-b309f9c60807\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020521 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45e1ce1-c477-4a87-a3ab-821e702ce490-metrics-certs\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020537 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c67f836d-7f9f-4827-a8d5-0f3934f96b14-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t4lwh\" (UID: \"c67f836d-7f9f-4827-a8d5-0f3934f96b14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020558 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020580 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-service-ca-bundle\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020639 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-bound-sa-token\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020659 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f45e1ce1-c477-4a87-a3ab-821e702ce490-stats-auth\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020690 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-trusted-ca\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020707 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2bd0fc3f-a365-48e4-bbe2-1509a739460a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mvnwh\" 
(UID: \"2bd0fc3f-a365-48e4-bbe2-1509a739460a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020726 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlmxl\" (UniqueName: \"kubernetes.io/projected/f45e1ce1-c477-4a87-a3ab-821e702ce490-kube-api-access-vlmxl\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020745 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rql7s\" (UniqueName: \"kubernetes.io/projected/cf8dba76-208a-4a46-8f25-50beb0d28ae6-kube-api-access-rql7s\") pod \"kube-storage-version-migrator-operator-b67b599dd-kd4xm\" (UID: \"cf8dba76-208a-4a46-8f25-50beb0d28ae6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020760 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fdc08015-ab62-4a36-a1f6-e17514ac47dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dkwzk\" (UID: \"fdc08015-ab62-4a36-a1f6-e17514ac47dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020777 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4dl5\" (UniqueName: \"kubernetes.io/projected/c67f836d-7f9f-4827-a8d5-0f3934f96b14-kube-api-access-v4dl5\") pod \"machine-config-controller-84d6567774-t4lwh\" (UID: \"c67f836d-7f9f-4827-a8d5-0f3934f96b14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 
14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020828 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qtr2r\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020846 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1d7892f-1c0b-4106-b34d-b309f9c60807-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45ssg\" (UID: \"c1d7892f-1c0b-4106-b34d-b309f9c60807\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020865 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfd7r\" (UniqueName: \"kubernetes.io/projected/d36c3ddb-8f3c-477d-bf22-1e336519a83e-kube-api-access-tfd7r\") pod \"migrator-59844c95c7-hc9t9\" (UID: \"d36c3ddb-8f3c-477d-bf22-1e336519a83e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020916 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-certificates\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020931 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020948 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c8db7d8-f286-4ca7-81ad-76198fcc5178-srv-cert\") pod \"catalog-operator-68c6474976-f54qw\" (UID: \"6c8db7d8-f286-4ca7-81ad-76198fcc5178\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.020980 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a259660c-b57b-4a89-9f33-19d3bb3f5a93-config-volume\") pod \"collect-profiles-29407110-d75hh\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021040 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-tmpfs\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021103 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-apiservice-cert\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021121 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-images\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021245 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/00ad88f1-cc82-4e00-8b0a-ddb548db43fa-srv-cert\") pod \"olm-operator-6b444d44fb-vw9j7\" (UID: \"00ad88f1-cc82-4e00-8b0a-ddb548db43fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021276 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhdk2\" (UniqueName: \"kubernetes.io/projected/00ad88f1-cc82-4e00-8b0a-ddb548db43fa-kube-api-access-nhdk2\") pod \"olm-operator-6b444d44fb-vw9j7\" (UID: \"00ad88f1-cc82-4e00-8b0a-ddb548db43fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021343 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021377 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-config\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021429 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-proxy-tls\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021472 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6c8db7d8-f286-4ca7-81ad-76198fcc5178-profile-collector-cert\") pod \"catalog-operator-68c6474976-f54qw\" (UID: \"6c8db7d8-f286-4ca7-81ad-76198fcc5178\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021487 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vck5q\" (UniqueName: \"kubernetes.io/projected/6c8db7d8-f286-4ca7-81ad-76198fcc5178-kube-api-access-vck5q\") pod \"catalog-operator-68c6474976-f54qw\" (UID: \"6c8db7d8-f286-4ca7-81ad-76198fcc5178\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021505 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql9fm\" (UniqueName: \"kubernetes.io/projected/a259660c-b57b-4a89-9f33-19d3bb3f5a93-kube-api-access-ql9fm\") pod \"collect-profiles-29407110-d75hh\" (UID: 
\"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d7892f-1c0b-4106-b34d-b309f9c60807-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45ssg\" (UID: \"c1d7892f-1c0b-4106-b34d-b309f9c60807\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021564 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j2hf\" (UniqueName: \"kubernetes.io/projected/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-kube-api-access-5j2hf\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021614 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c5dabfe-62e3-4104-9939-59e4832c6484-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-75hhz\" (UID: \"0c5dabfe-62e3-4104-9939-59e4832c6484\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021634 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f45e1ce1-c477-4a87-a3ab-821e702ce490-default-certificate\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc 
kubenswrapper[4907]: I1129 14:30:50.021663 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6twp7\" (UniqueName: \"kubernetes.io/projected/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-kube-api-access-6twp7\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021680 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqwtt\" (UniqueName: \"kubernetes.io/projected/fdc08015-ab62-4a36-a1f6-e17514ac47dd-kube-api-access-sqwtt\") pod \"multus-admission-controller-857f4d67dd-dkwzk\" (UID: \"fdc08015-ab62-4a36-a1f6-e17514ac47dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021700 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bklxl\" (UniqueName: \"kubernetes.io/projected/429c8834-660f-4631-889b-f00c5f79f30a-kube-api-access-bklxl\") pod \"package-server-manager-789f6589d5-vjb5n\" (UID: \"429c8834-660f-4631-889b-f00c5f79f30a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021733 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021773 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dfk7d\" (UniqueName: \"kubernetes.io/projected/ac45e4d4-e19e-49d8-a1da-1c27442e7da7-kube-api-access-dfk7d\") pod \"ingress-canary-jrw7z\" (UID: \"ac45e4d4-e19e-49d8-a1da-1c27442e7da7\") " pod="openshift-ingress-canary/ingress-canary-jrw7z" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021792 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf8dba76-208a-4a46-8f25-50beb0d28ae6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kd4xm\" (UID: \"cf8dba76-208a-4a46-8f25-50beb0d28ae6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021810 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-webhook-cert\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021826 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl7zz\" (UniqueName: \"kubernetes.io/projected/2bd0fc3f-a365-48e4-bbe2-1509a739460a-kube-api-access-nl7zz\") pod \"openshift-config-operator-7777fb866f-mvnwh\" (UID: \"2bd0fc3f-a365-48e4-bbe2-1509a739460a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021844 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/429c8834-660f-4631-889b-f00c5f79f30a-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-vjb5n\" (UID: \"429c8834-660f-4631-889b-f00c5f79f30a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021876 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-tls\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021892 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjdt6\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-kube-api-access-gjdt6\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021912 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac45e4d4-e19e-49d8-a1da-1c27442e7da7-cert\") pod \"ingress-canary-jrw7z\" (UID: \"ac45e4d4-e19e-49d8-a1da-1c27442e7da7\") " pod="openshift-ingress-canary/ingress-canary-jrw7z" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021929 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/00ad88f1-cc82-4e00-8b0a-ddb548db43fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vw9j7\" (UID: \"00ad88f1-cc82-4e00-8b0a-ddb548db43fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021947 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c67f836d-7f9f-4827-a8d5-0f3934f96b14-proxy-tls\") pod \"machine-config-controller-84d6567774-t4lwh\" (UID: \"c67f836d-7f9f-4827-a8d5-0f3934f96b14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021963 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z7wg\" (UniqueName: \"kubernetes.io/projected/0c5dabfe-62e3-4104-9939-59e4832c6484-kube-api-access-8z7wg\") pod \"control-plane-machine-set-operator-78cbb6b69f-75hhz\" (UID: \"0c5dabfe-62e3-4104-9939-59e4832c6484\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021980 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gkbm\" (UniqueName: \"kubernetes.io/projected/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-kube-api-access-6gkbm\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.021998 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rmkd\" (UniqueName: \"kubernetes.io/projected/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-kube-api-access-2rmkd\") pod \"marketplace-operator-79b997595-qtr2r\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.022087 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-serving-cert\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.022139 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.022168 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qtr2r\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.024023 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:50.523903468 +0000 UTC m=+148.510741120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.058539 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-rthjj" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.060412 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.065718 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.079000 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qdmt6"] Nov 29 14:30:50 crc kubenswrapper[4907]: W1129 14:30:50.088833 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-5d9e6e59046f426d2fac41b12af942a254e74c114021eb21aeb575f1cf0f5fa0 WatchSource:0}: Error finding container 5d9e6e59046f426d2fac41b12af942a254e74c114021eb21aeb575f1cf0f5fa0: Status 404 returned error can't find the container with id 5d9e6e59046f426d2fac41b12af942a254e74c114021eb21aeb575f1cf0f5fa0 Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.090608 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c572b"] Nov 29 14:30:50 crc 
kubenswrapper[4907]: I1129 14:30:50.099869 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm"] Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.111236 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.122950 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123164 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-service-ca-bundle\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123184 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.123226 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-29 14:30:50.623209763 +0000 UTC m=+148.610047415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123246 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-bound-sa-token\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123272 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f45e1ce1-c477-4a87-a3ab-821e702ce490-stats-auth\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123287 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-trusted-ca\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123302 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/2bd0fc3f-a365-48e4-bbe2-1509a739460a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mvnwh\" (UID: \"2bd0fc3f-a365-48e4-bbe2-1509a739460a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123317 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlmxl\" (UniqueName: \"kubernetes.io/projected/f45e1ce1-c477-4a87-a3ab-821e702ce490-kube-api-access-vlmxl\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123332 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rql7s\" (UniqueName: \"kubernetes.io/projected/cf8dba76-208a-4a46-8f25-50beb0d28ae6-kube-api-access-rql7s\") pod \"kube-storage-version-migrator-operator-b67b599dd-kd4xm\" (UID: \"cf8dba76-208a-4a46-8f25-50beb0d28ae6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123349 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fdc08015-ab62-4a36-a1f6-e17514ac47dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dkwzk\" (UID: \"fdc08015-ab62-4a36-a1f6-e17514ac47dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123363 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4dl5\" (UniqueName: \"kubernetes.io/projected/c67f836d-7f9f-4827-a8d5-0f3934f96b14-kube-api-access-v4dl5\") pod \"machine-config-controller-84d6567774-t4lwh\" (UID: \"c67f836d-7f9f-4827-a8d5-0f3934f96b14\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123379 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qtr2r\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123394 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1d7892f-1c0b-4106-b34d-b309f9c60807-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45ssg\" (UID: \"c1d7892f-1c0b-4106-b34d-b309f9c60807\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123409 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfd7r\" (UniqueName: \"kubernetes.io/projected/d36c3ddb-8f3c-477d-bf22-1e336519a83e-kube-api-access-tfd7r\") pod \"migrator-59844c95c7-hc9t9\" (UID: \"d36c3ddb-8f3c-477d-bf22-1e336519a83e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123459 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-certificates\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123475 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123489 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c8db7d8-f286-4ca7-81ad-76198fcc5178-srv-cert\") pod \"catalog-operator-68c6474976-f54qw\" (UID: \"6c8db7d8-f286-4ca7-81ad-76198fcc5178\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123514 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a259660c-b57b-4a89-9f33-19d3bb3f5a93-config-volume\") pod \"collect-profiles-29407110-d75hh\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123535 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c630db4e-b02b-480e-bfe7-c93188434b14-node-bootstrap-token\") pod \"machine-config-server-2j64h\" (UID: \"c630db4e-b02b-480e-bfe7-c93188434b14\") " pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123555 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-tmpfs\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123574 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-plugins-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123603 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-registration-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123619 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn7qr\" (UniqueName: \"kubernetes.io/projected/be668353-b5df-42f6-bc63-9e896be4f7e7-kube-api-access-cn7qr\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123634 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-csi-data-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123654 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-apiservice-cert\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123669 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-images\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123689 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/00ad88f1-cc82-4e00-8b0a-ddb548db43fa-srv-cert\") pod \"olm-operator-6b444d44fb-vw9j7\" (UID: \"00ad88f1-cc82-4e00-8b0a-ddb548db43fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123722 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhdk2\" (UniqueName: \"kubernetes.io/projected/00ad88f1-cc82-4e00-8b0a-ddb548db43fa-kube-api-access-nhdk2\") pod \"olm-operator-6b444d44fb-vw9j7\" (UID: \"00ad88f1-cc82-4e00-8b0a-ddb548db43fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123752 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123769 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/595bcc18-9208-4034-b70d-be547bd61aa0-config-volume\") pod \"dns-default-jxwmf\" (UID: \"595bcc18-9208-4034-b70d-be547bd61aa0\") " pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123794 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-config\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123818 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-proxy-tls\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123836 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6c8db7d8-f286-4ca7-81ad-76198fcc5178-profile-collector-cert\") pod \"catalog-operator-68c6474976-f54qw\" (UID: \"6c8db7d8-f286-4ca7-81ad-76198fcc5178\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123851 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vck5q\" (UniqueName: \"kubernetes.io/projected/6c8db7d8-f286-4ca7-81ad-76198fcc5178-kube-api-access-vck5q\") pod \"catalog-operator-68c6474976-f54qw\" (UID: \"6c8db7d8-f286-4ca7-81ad-76198fcc5178\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123866 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql9fm\" (UniqueName: \"kubernetes.io/projected/a259660c-b57b-4a89-9f33-19d3bb3f5a93-kube-api-access-ql9fm\") pod \"collect-profiles-29407110-d75hh\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123868 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123881 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d7892f-1c0b-4106-b34d-b309f9c60807-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45ssg\" (UID: \"c1d7892f-1c0b-4106-b34d-b309f9c60807\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123952 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j2hf\" (UniqueName: \"kubernetes.io/projected/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-kube-api-access-5j2hf\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.123983 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-socket-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") 
" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c5dabfe-62e3-4104-9939-59e4832c6484-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-75hhz\" (UID: \"0c5dabfe-62e3-4104-9939-59e4832c6484\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124036 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f45e1ce1-c477-4a87-a3ab-821e702ce490-default-certificate\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124084 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6twp7\" (UniqueName: \"kubernetes.io/projected/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-kube-api-access-6twp7\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124105 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqwtt\" (UniqueName: \"kubernetes.io/projected/fdc08015-ab62-4a36-a1f6-e17514ac47dd-kube-api-access-sqwtt\") pod \"multus-admission-controller-857f4d67dd-dkwzk\" (UID: \"fdc08015-ab62-4a36-a1f6-e17514ac47dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124126 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-wlthh\" (UniqueName: \"kubernetes.io/projected/c630db4e-b02b-480e-bfe7-c93188434b14-kube-api-access-wlthh\") pod \"machine-config-server-2j64h\" (UID: \"c630db4e-b02b-480e-bfe7-c93188434b14\") " pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124161 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bklxl\" (UniqueName: \"kubernetes.io/projected/429c8834-660f-4631-889b-f00c5f79f30a-kube-api-access-bklxl\") pod \"package-server-manager-789f6589d5-vjb5n\" (UID: \"429c8834-660f-4631-889b-f00c5f79f30a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124188 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124206 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfk7d\" (UniqueName: \"kubernetes.io/projected/ac45e4d4-e19e-49d8-a1da-1c27442e7da7-kube-api-access-dfk7d\") pod \"ingress-canary-jrw7z\" (UID: \"ac45e4d4-e19e-49d8-a1da-1c27442e7da7\") " pod="openshift-ingress-canary/ingress-canary-jrw7z" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124223 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-mountpoint-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " 
pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124251 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf8dba76-208a-4a46-8f25-50beb0d28ae6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kd4xm\" (UID: \"cf8dba76-208a-4a46-8f25-50beb0d28ae6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124267 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-webhook-cert\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124286 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl7zz\" (UniqueName: \"kubernetes.io/projected/2bd0fc3f-a365-48e4-bbe2-1509a739460a-kube-api-access-nl7zz\") pod \"openshift-config-operator-7777fb866f-mvnwh\" (UID: \"2bd0fc3f-a365-48e4-bbe2-1509a739460a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124304 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/429c8834-660f-4631-889b-f00c5f79f30a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vjb5n\" (UID: \"429c8834-660f-4631-889b-f00c5f79f30a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124322 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c630db4e-b02b-480e-bfe7-c93188434b14-certs\") pod \"machine-config-server-2j64h\" (UID: \"c630db4e-b02b-480e-bfe7-c93188434b14\") " pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124348 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-tls\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124364 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjdt6\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-kube-api-access-gjdt6\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124380 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac45e4d4-e19e-49d8-a1da-1c27442e7da7-cert\") pod \"ingress-canary-jrw7z\" (UID: \"ac45e4d4-e19e-49d8-a1da-1c27442e7da7\") " pod="openshift-ingress-canary/ingress-canary-jrw7z" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124397 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/00ad88f1-cc82-4e00-8b0a-ddb548db43fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vw9j7\" (UID: \"00ad88f1-cc82-4e00-8b0a-ddb548db43fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 
14:30:50.124399 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d7892f-1c0b-4106-b34d-b309f9c60807-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45ssg\" (UID: \"c1d7892f-1c0b-4106-b34d-b309f9c60807\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124415 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c67f836d-7f9f-4827-a8d5-0f3934f96b14-proxy-tls\") pod \"machine-config-controller-84d6567774-t4lwh\" (UID: \"c67f836d-7f9f-4827-a8d5-0f3934f96b14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124481 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gkbm\" (UniqueName: \"kubernetes.io/projected/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-kube-api-access-6gkbm\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124506 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z7wg\" (UniqueName: \"kubernetes.io/projected/0c5dabfe-62e3-4104-9939-59e4832c6484-kube-api-access-8z7wg\") pod \"control-plane-machine-set-operator-78cbb6b69f-75hhz\" (UID: \"0c5dabfe-62e3-4104-9939-59e4832c6484\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124757 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rmkd\" (UniqueName: \"kubernetes.io/projected/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-kube-api-access-2rmkd\") 
pod \"marketplace-operator-79b997595-qtr2r\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124798 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-serving-cert\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124824 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124849 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qtr2r\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124876 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/595bcc18-9208-4034-b70d-be547bd61aa0-metrics-tls\") pod \"dns-default-jxwmf\" (UID: \"595bcc18-9208-4034-b70d-be547bd61aa0\") " pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124897 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-xn9lj\" (UniqueName: \"kubernetes.io/projected/595bcc18-9208-4034-b70d-be547bd61aa0-kube-api-access-xn9lj\") pod \"dns-default-jxwmf\" (UID: \"595bcc18-9208-4034-b70d-be547bd61aa0\") " pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124918 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45e1ce1-c477-4a87-a3ab-821e702ce490-service-ca-bundle\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124934 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bd0fc3f-a365-48e4-bbe2-1509a739460a-serving-cert\") pod \"openshift-config-operator-7777fb866f-mvnwh\" (UID: \"2bd0fc3f-a365-48e4-bbe2-1509a739460a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124955 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a259660c-b57b-4a89-9f33-19d3bb3f5a93-secret-volume\") pod \"collect-profiles-29407110-d75hh\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.124972 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8dba76-208a-4a46-8f25-50beb0d28ae6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kd4xm\" (UID: \"cf8dba76-208a-4a46-8f25-50beb0d28ae6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc 
kubenswrapper[4907]: I1129 14:30:50.124989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d7892f-1c0b-4106-b34d-b309f9c60807-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45ssg\" (UID: \"c1d7892f-1c0b-4106-b34d-b309f9c60807\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.125012 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45e1ce1-c477-4a87-a3ab-821e702ce490-metrics-certs\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.125029 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c67f836d-7f9f-4827-a8d5-0f3934f96b14-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t4lwh\" (UID: \"c67f836d-7f9f-4827-a8d5-0f3934f96b14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.125781 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c67f836d-7f9f-4827-a8d5-0f3934f96b14-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t4lwh\" (UID: \"c67f836d-7f9f-4827-a8d5-0f3934f96b14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.125845 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.128818 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-trusted-ca\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.129138 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2bd0fc3f-a365-48e4-bbe2-1509a739460a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mvnwh\" (UID: \"2bd0fc3f-a365-48e4-bbe2-1509a739460a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.133341 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac45e4d4-e19e-49d8-a1da-1c27442e7da7-cert\") pod \"ingress-canary-jrw7z\" (UID: \"ac45e4d4-e19e-49d8-a1da-1c27442e7da7\") " pod="openshift-ingress-canary/ingress-canary-jrw7z" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.134087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c5dabfe-62e3-4104-9939-59e4832c6484-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-75hhz\" (UID: \"0c5dabfe-62e3-4104-9939-59e4832c6484\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.138000 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:50.637987231 +0000 UTC m=+148.624824873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.141488 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-config\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.145073 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f45e1ce1-c477-4a87-a3ab-821e702ce490-stats-auth\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.146618 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.147448 
4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8dba76-208a-4a46-8f25-50beb0d28ae6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kd4xm\" (UID: \"cf8dba76-208a-4a46-8f25-50beb0d28ae6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.148135 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-images\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.150888 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45e1ce1-c477-4a87-a3ab-821e702ce490-service-ca-bundle\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.151229 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fdc08015-ab62-4a36-a1f6-e17514ac47dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dkwzk\" (UID: \"fdc08015-ab62-4a36-a1f6-e17514ac47dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.153057 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/00ad88f1-cc82-4e00-8b0a-ddb548db43fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vw9j7\" (UID: \"00ad88f1-cc82-4e00-8b0a-ddb548db43fa\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.153161 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a259660c-b57b-4a89-9f33-19d3bb3f5a93-secret-volume\") pod \"collect-profiles-29407110-d75hh\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.155994 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-proxy-tls\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.157792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.158656 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-apiservice-cert\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.159544 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a259660c-b57b-4a89-9f33-19d3bb3f5a93-config-volume\") pod 
\"collect-profiles-29407110-d75hh\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.159997 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qtr2r\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.160301 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-tmpfs\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.160817 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6c8db7d8-f286-4ca7-81ad-76198fcc5178-profile-collector-cert\") pod \"catalog-operator-68c6474976-f54qw\" (UID: \"6c8db7d8-f286-4ca7-81ad-76198fcc5178\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.167990 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.170634 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-certificates\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.172004 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c8db7d8-f286-4ca7-81ad-76198fcc5178-srv-cert\") pod \"catalog-operator-68c6474976-f54qw\" (UID: \"6c8db7d8-f286-4ca7-81ad-76198fcc5178\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.173081 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c67f836d-7f9f-4827-a8d5-0f3934f96b14-proxy-tls\") pod \"machine-config-controller-84d6567774-t4lwh\" (UID: \"c67f836d-7f9f-4827-a8d5-0f3934f96b14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.173616 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.174604 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/429c8834-660f-4631-889b-f00c5f79f30a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vjb5n\" (UID: \"429c8834-660f-4631-889b-f00c5f79f30a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.176983 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bd0fc3f-a365-48e4-bbe2-1509a739460a-serving-cert\") pod \"openshift-config-operator-7777fb866f-mvnwh\" (UID: \"2bd0fc3f-a365-48e4-bbe2-1509a739460a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.177086 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45e1ce1-c477-4a87-a3ab-821e702ce490-metrics-certs\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.179261 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-bound-sa-token\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.180026 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-serving-cert\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.182520 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f45e1ce1-c477-4a87-a3ab-821e702ce490-default-certificate\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.184043 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qtr2r\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.186961 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-tls\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.188901 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf8dba76-208a-4a46-8f25-50beb0d28ae6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kd4xm\" (UID: \"cf8dba76-208a-4a46-8f25-50beb0d28ae6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.192043 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/00ad88f1-cc82-4e00-8b0a-ddb548db43fa-srv-cert\") pod \"olm-operator-6b444d44fb-vw9j7\" (UID: \"00ad88f1-cc82-4e00-8b0a-ddb548db43fa\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.192201 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-webhook-cert\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.192676 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j2hf\" (UniqueName: \"kubernetes.io/projected/c3efd626-bb2f-4257-ac3a-fbf6d7a76780-kube-api-access-5j2hf\") pod \"packageserver-d55dfcdfc-54x97\" (UID: \"c3efd626-bb2f-4257-ac3a-fbf6d7a76780\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.192895 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d7892f-1c0b-4106-b34d-b309f9c60807-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45ssg\" (UID: \"c1d7892f-1c0b-4106-b34d-b309f9c60807\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: W1129 14:30:50.200167 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2417b89a_c4ec_46b8_9e9f_95ec8cd1fbef.slice/crio-994809daffc16bb903bd7296894afdb5905242d898fc2cba8d20d0b200013918 WatchSource:0}: Error finding container 994809daffc16bb903bd7296894afdb5905242d898fc2cba8d20d0b200013918: Status 404 returned error can't find the container with id 994809daffc16bb903bd7296894afdb5905242d898fc2cba8d20d0b200013918 Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.212146 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vlmxl\" (UniqueName: \"kubernetes.io/projected/f45e1ce1-c477-4a87-a3ab-821e702ce490-kube-api-access-vlmxl\") pod \"router-default-5444994796-dwtxw\" (UID: \"f45e1ce1-c477-4a87-a3ab-821e702ce490\") " pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.215543 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.225729 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.225917 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c630db4e-b02b-480e-bfe7-c93188434b14-certs\") pod \"machine-config-server-2j64h\" (UID: \"c630db4e-b02b-480e-bfe7-c93188434b14\") " pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.225975 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/595bcc18-9208-4034-b70d-be547bd61aa0-metrics-tls\") pod \"dns-default-jxwmf\" (UID: \"595bcc18-9208-4034-b70d-be547bd61aa0\") " pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.225993 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9lj\" (UniqueName: \"kubernetes.io/projected/595bcc18-9208-4034-b70d-be547bd61aa0-kube-api-access-xn9lj\") pod \"dns-default-jxwmf\" (UID: \"595bcc18-9208-4034-b70d-be547bd61aa0\") " 
pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226055 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c630db4e-b02b-480e-bfe7-c93188434b14-node-bootstrap-token\") pod \"machine-config-server-2j64h\" (UID: \"c630db4e-b02b-480e-bfe7-c93188434b14\") " pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226073 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-plugins-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226090 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-registration-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226109 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn7qr\" (UniqueName: \"kubernetes.io/projected/be668353-b5df-42f6-bc63-9e896be4f7e7-kube-api-access-cn7qr\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226124 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-csi-data-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " 
pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226167 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/595bcc18-9208-4034-b70d-be547bd61aa0-config-volume\") pod \"dns-default-jxwmf\" (UID: \"595bcc18-9208-4034-b70d-be547bd61aa0\") " pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226201 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-socket-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226231 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlthh\" (UniqueName: \"kubernetes.io/projected/c630db4e-b02b-480e-bfe7-c93188434b14-kube-api-access-wlthh\") pod \"machine-config-server-2j64h\" (UID: \"c630db4e-b02b-480e-bfe7-c93188434b14\") " pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226261 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-mountpoint-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.226343 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-mountpoint-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " 
pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.226416 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:50.726399156 +0000 UTC m=+148.713236808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.232551 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-registration-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.232645 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-plugins-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.232685 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-socket-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " 
pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.232824 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/be668353-b5df-42f6-bc63-9e896be4f7e7-csi-data-dir\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.233010 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/595bcc18-9208-4034-b70d-be547bd61aa0-config-volume\") pod \"dns-default-jxwmf\" (UID: \"595bcc18-9208-4034-b70d-be547bd61aa0\") " pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.238117 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c630db4e-b02b-480e-bfe7-c93188434b14-node-bootstrap-token\") pod \"machine-config-server-2j64h\" (UID: \"c630db4e-b02b-480e-bfe7-c93188434b14\") " pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.246871 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.252545 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c630db4e-b02b-480e-bfe7-c93188434b14-certs\") pod \"machine-config-server-2j64h\" (UID: \"c630db4e-b02b-480e-bfe7-c93188434b14\") " pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.260835 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rql7s\" (UniqueName: \"kubernetes.io/projected/cf8dba76-208a-4a46-8f25-50beb0d28ae6-kube-api-access-rql7s\") pod \"kube-storage-version-migrator-operator-b67b599dd-kd4xm\" (UID: \"cf8dba76-208a-4a46-8f25-50beb0d28ae6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.260898 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/595bcc18-9208-4034-b70d-be547bd61aa0-metrics-tls\") pod \"dns-default-jxwmf\" (UID: \"595bcc18-9208-4034-b70d-be547bd61aa0\") " pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.271986 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vck5q\" (UniqueName: \"kubernetes.io/projected/6c8db7d8-f286-4ca7-81ad-76198fcc5178-kube-api-access-vck5q\") pod \"catalog-operator-68c6474976-f54qw\" (UID: \"6c8db7d8-f286-4ca7-81ad-76198fcc5178\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.287140 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhdk2\" (UniqueName: 
\"kubernetes.io/projected/00ad88f1-cc82-4e00-8b0a-ddb548db43fa-kube-api-access-nhdk2\") pod \"olm-operator-6b444d44fb-vw9j7\" (UID: \"00ad88f1-cc82-4e00-8b0a-ddb548db43fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.289409 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6twp7\" (UniqueName: \"kubernetes.io/projected/26174366-e2ac-42d4-b0aa-ce6dd5e50ade-kube-api-access-6twp7\") pod \"machine-config-operator-74547568cd-2l24g\" (UID: \"26174366-e2ac-42d4-b0aa-ce6dd5e50ade\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.316136 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqwtt\" (UniqueName: \"kubernetes.io/projected/fdc08015-ab62-4a36-a1f6-e17514ac47dd-kube-api-access-sqwtt\") pod \"multus-admission-controller-857f4d67dd-dkwzk\" (UID: \"fdc08015-ab62-4a36-a1f6-e17514ac47dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.325455 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc"] Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.327032 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.327423 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:50.827412 +0000 UTC m=+148.814249652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.344782 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bklxl\" (UniqueName: \"kubernetes.io/projected/429c8834-660f-4631-889b-f00c5f79f30a-kube-api-access-bklxl\") pod \"package-server-manager-789f6589d5-vjb5n\" (UID: \"429c8834-660f-4631-889b-f00c5f79f30a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.350451 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" event={"ID":"73987028-5c99-42b5-b871-213b94cd4826","Type":"ContainerStarted","Data":"d8c75a55e6c53fae4e1fee4dd6eed2f03d9b9a8adf1b22161871b961f1d86ec2"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.350498 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" event={"ID":"73987028-5c99-42b5-b871-213b94cd4826","Type":"ContainerStarted","Data":"b21ad5cc8dcedc343a02ee5aba77346d88767d9a4df43283bee7b54819477576"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.351330 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c3eeb5449416876642e496c3702c626679bcf3d17a049c48de341893e47a5046"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.352381 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" event={"ID":"7a422846-a8dc-47ce-912d-6444fb22b575","Type":"ContainerStarted","Data":"82c8bfbc63ba16ff25ddd9a525400a18211a9c7b8f2457e95209df8bf2b5c111"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.353310 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" event={"ID":"ec75ca64-71c8-4a23-8723-378e0450548b","Type":"ContainerStarted","Data":"b5571c67cc5d0b17c2278389b35e055b6529696bc3bf78261f2df64a02929504"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.354623 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gkbm\" (UniqueName: \"kubernetes.io/projected/a83e8666-c5ff-45bf-a1ed-584a2554e6aa-kube-api-access-6gkbm\") pod \"authentication-operator-69f744f599-mgntg\" (UID: \"a83e8666-c5ff-45bf-a1ed-584a2554e6aa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.356298 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" event={"ID":"ccf1336b-0e9f-42d7-adca-45e017360773","Type":"ContainerStarted","Data":"6ad451664396489b6e900dc53a0661e68bc858f1a2d20d180681c4c6972f8adf"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.357774 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" 
event={"ID":"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef","Type":"ContainerStarted","Data":"994809daffc16bb903bd7296894afdb5905242d898fc2cba8d20d0b200013918"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.358704 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5t44g" event={"ID":"0c70da9a-ff96-432f-81ad-382c70754e70","Type":"ContainerStarted","Data":"7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.358731 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5t44g" event={"ID":"0c70da9a-ff96-432f-81ad-382c70754e70","Type":"ContainerStarted","Data":"07c8cf20bbec9cd35ce2496b5a307d905e7de091190a7992d5c49f33ddcc2dd3"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.367725 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5d9e6e59046f426d2fac41b12af942a254e74c114021eb21aeb575f1cf0f5fa0"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.369712 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z7wg\" (UniqueName: \"kubernetes.io/projected/0c5dabfe-62e3-4104-9939-59e4832c6484-kube-api-access-8z7wg\") pod \"control-plane-machine-set-operator-78cbb6b69f-75hhz\" (UID: \"0c5dabfe-62e3-4104-9939-59e4832c6484\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.379243 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" event={"ID":"72fe0643-91a8-459e-aec7-257e5b07ea41","Type":"ContainerStarted","Data":"d222cdfcde1854b9c7aca914be13e2f5f353ed421a3d6d984dfa6640cb6aaa69"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 
14:30:50.384304 4907 generic.go:334] "Generic (PLEG): container finished" podID="ad127416-8f9c-4d2d-a13a-8a0b69525847" containerID="8dc85bf58cf79a6bf9bd642d3c927578da0c2c2f595e83149b64a874d24cbdf8" exitCode=0 Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.384612 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" event={"ID":"ad127416-8f9c-4d2d-a13a-8a0b69525847","Type":"ContainerDied","Data":"8dc85bf58cf79a6bf9bd642d3c927578da0c2c2f595e83149b64a874d24cbdf8"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.385186 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" event={"ID":"ad127416-8f9c-4d2d-a13a-8a0b69525847","Type":"ContainerStarted","Data":"36438aa49eb5a528efe77428c77a18ce0acb6ed48eea1e5b8a02f0cd4e220f2a"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.392838 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rmkd\" (UniqueName: \"kubernetes.io/projected/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-kube-api-access-2rmkd\") pod \"marketplace-operator-79b997595-qtr2r\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.403713 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" event={"ID":"28aad01a-534b-4d04-aceb-ad163db9871c","Type":"ContainerStarted","Data":"138e35151837d2bb55201d65794b5b089735db44ebc0adf8bbd7b46df41e02e7"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.403775 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" event={"ID":"28aad01a-534b-4d04-aceb-ad163db9871c","Type":"ContainerStarted","Data":"254a3cc3e591d73a7081b014b1bca0993d834b141a9c009801cd6dd042bf0cc1"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.404650 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" event={"ID":"66db1fbb-f050-4af3-977b-831602348a9b","Type":"ContainerStarted","Data":"1c35c8bf13475d15e931ea366aa012ad3edcb35af19397bd34067a7402d00d53"} Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.427244 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfk7d\" (UniqueName: \"kubernetes.io/projected/ac45e4d4-e19e-49d8-a1da-1c27442e7da7-kube-api-access-dfk7d\") pod \"ingress-canary-jrw7z\" (UID: \"ac45e4d4-e19e-49d8-a1da-1c27442e7da7\") " pod="openshift-ingress-canary/ingress-canary-jrw7z" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.427664 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.429343 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:50.929322301 +0000 UTC m=+148.916159953 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.436116 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql9fm\" (UniqueName: \"kubernetes.io/projected/a259660c-b57b-4a89-9f33-19d3bb3f5a93-kube-api-access-ql9fm\") pod \"collect-profiles-29407110-d75hh\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.472324 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.472579 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.482465 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl7zz\" (UniqueName: \"kubernetes.io/projected/2bd0fc3f-a365-48e4-bbe2-1509a739460a-kube-api-access-nl7zz\") pod \"openshift-config-operator-7777fb866f-mvnwh\" (UID: \"2bd0fc3f-a365-48e4-bbe2-1509a739460a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.486153 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1d7892f-1c0b-4106-b34d-b309f9c60807-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45ssg\" (UID: \"c1d7892f-1c0b-4106-b34d-b309f9c60807\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.493769 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.503284 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4dl5\" (UniqueName: \"kubernetes.io/projected/c67f836d-7f9f-4827-a8d5-0f3934f96b14-kube-api-access-v4dl5\") pod \"machine-config-controller-84d6567774-t4lwh\" (UID: \"c67f836d-7f9f-4827-a8d5-0f3934f96b14\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.504403 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.509100 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.521623 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.536869 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.536887 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.542343 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.543091 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfd7r\" (UniqueName: \"kubernetes.io/projected/d36c3ddb-8f3c-477d-bf22-1e336519a83e-kube-api-access-tfd7r\") pod \"migrator-59844c95c7-hc9t9\" (UID: \"d36c3ddb-8f3c-477d-bf22-1e336519a83e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9" Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.544548 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:51.044525679 +0000 UTC m=+149.031363331 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.551370 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.567330 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjdt6\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-kube-api-access-gjdt6\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.567681 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.568473 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.577243 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9lj\" (UniqueName: \"kubernetes.io/projected/595bcc18-9208-4034-b70d-be547bd61aa0-kube-api-access-xn9lj\") pod \"dns-default-jxwmf\" (UID: \"595bcc18-9208-4034-b70d-be547bd61aa0\") " pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.588041 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.594070 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn7qr\" (UniqueName: \"kubernetes.io/projected/be668353-b5df-42f6-bc63-9e896be4f7e7-kube-api-access-cn7qr\") pod \"csi-hostpathplugin-wl27q\" (UID: \"be668353-b5df-42f6-bc63-9e896be4f7e7\") " pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.594745 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrw7z" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.602630 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.616777 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlthh\" (UniqueName: \"kubernetes.io/projected/c630db4e-b02b-480e-bfe7-c93188434b14-kube-api-access-wlthh\") pod \"machine-config-server-2j64h\" (UID: \"c630db4e-b02b-480e-bfe7-c93188434b14\") " pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.624173 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.646011 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.646712 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:51.146695721 +0000 UTC m=+149.133533373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.736023 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj"] Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.738033 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2j64h" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.757772 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv"] Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.761425 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.763951 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:51.263939081 +0000 UTC m=+149.250776733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.772638 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gpnkx"] Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.777212 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jcqhw"] Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.779571 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw"] Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.788321 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.797235 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.798479 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jcdhm"] Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.865197 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.865924 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:51.365904324 +0000 UTC m=+149.352741976 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: W1129 14:30:50.921692 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b815226_63ff_4e97_bdcd_4a48ee001b99.slice/crio-db8f57a23befd6f562ee9a538cc7c6f25517a90529df9c69b6ef0cc6da7316e6 WatchSource:0}: Error finding container db8f57a23befd6f562ee9a538cc7c6f25517a90529df9c69b6ef0cc6da7316e6: Status 404 returned error can't find the container with id db8f57a23befd6f562ee9a538cc7c6f25517a90529df9c69b6ef0cc6da7316e6 Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.951628 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rthjj"] Nov 29 14:30:50 crc kubenswrapper[4907]: I1129 14:30:50.967072 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:50 crc kubenswrapper[4907]: E1129 14:30:50.967535 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:51.467498553 +0000 UTC m=+149.454336205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:50 crc kubenswrapper[4907]: W1129 14:30:50.999883 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb0351fd_226e_4fc2_b8f6_0132252c4e2e.slice/crio-5493cfceee76423c5bcfda3b6de4e7d8df8f3905a2ffee6ee79d9cccb4927f0d WatchSource:0}: Error finding container 5493cfceee76423c5bcfda3b6de4e7d8df8f3905a2ffee6ee79d9cccb4927f0d: Status 404 returned error can't find the container with id 5493cfceee76423c5bcfda3b6de4e7d8df8f3905a2ffee6ee79d9cccb4927f0d Nov 29 14:30:51 crc kubenswrapper[4907]: W1129 14:30:51.019876 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5769473_e380_4d0e_bfe4_aab057473a62.slice/crio-f2288d5c0dfacb1c2d4d73df8cd94c2ad2a05a08bba13b59fa142b98958b6b44 WatchSource:0}: Error finding container f2288d5c0dfacb1c2d4d73df8cd94c2ad2a05a08bba13b59fa142b98958b6b44: Status 404 returned error can't find the container with id f2288d5c0dfacb1c2d4d73df8cd94c2ad2a05a08bba13b59fa142b98958b6b44 Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.020862 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97"] Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.067902 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.068576 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:51.568539088 +0000 UTC m=+149.555376740 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.125234 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hhl5h"] Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.133486 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mgntg"] Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.135024 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j46xp"] Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.154951 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g"] Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.169058 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.169312 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:51.669299712 +0000 UTC m=+149.656137364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.304134 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.305174 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:51.805157326 +0000 UTC m=+149.791994978 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.313053 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm"] Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.406097 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.406537 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:51.906509384 +0000 UTC m=+149.893347036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.417086 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh"] Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.464319 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"db829e395bc7251195bef4ad14617a90b7a4ed68e7c3266df9e0d16927f8efc9"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.464369 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"64daa8450a6b13a4eb4e8986c1c8d12e7632e0ce9280cddd08a48c24fa1abba0"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.464876 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.476896 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" event={"ID":"66db1fbb-f050-4af3-977b-831602348a9b","Type":"ContainerStarted","Data":"3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.477509 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.481525 4907 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-qdmt6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.481582 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" podUID="66db1fbb-f050-4af3-977b-831602348a9b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.481769 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" event={"ID":"fa61d693-3aab-4537-9075-3f99fde5cb8d","Type":"ContainerStarted","Data":"06e4d5ac6c0c08e882949a3771cee8573f0f712ccbd7313ea2584ae4cf0be365"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.484129 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" event={"ID":"2417b89a-c4ec-46b8-9e9f-95ec8cd1fbef","Type":"ContainerStarted","Data":"f4c8c38a57973d2effe1e81acb5ae78e24789c7adc17ea8ae2ba1c9205eefe53"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.497169 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4a01ee499d37d378609f9fbc18350f5401ab351738991d7efe6ff3259c0eff45"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.502066 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" event={"ID":"fd9557ee-7350-479a-80d7-20ae8c6a229d","Type":"ContainerStarted","Data":"bb7779d89914faba8c42c8bfb493faf039bd9b9fb6db5b4401805717aa3be44d"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.502958 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" event={"ID":"87211b95-db9a-4429-be07-91b57e6355c3","Type":"ContainerStarted","Data":"f4aa2f1028b8f9f097388a4cfc2d84ad3b5cf443cb651ae9651832998447f68b"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.505047 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" event={"ID":"415a984e-e3ba-4e39-adb2-f79c8ed05f3f","Type":"ContainerStarted","Data":"d2312c7f8967527eaed614713b0d7b7d5dc820425f8ea3f3af9f83507890de92"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.507208 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" podStartSLOduration=131.507198226 podStartE2EDuration="2m11.507198226s" podCreationTimestamp="2025-11-29 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:51.468456739 +0000 UTC m=+149.455294391" watchObservedRunningTime="2025-11-29 14:30:51.507198226 +0000 UTC m=+149.494035878" Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.508144 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.508331 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.008314171 +0000 UTC m=+149.995151823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.508693 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" event={"ID":"ed407316-6a6a-47dc-99eb-6be44d35b22f","Type":"ContainerStarted","Data":"9de9a97fbbcfde37b6129af0bbefc532a3b5a788d32e8f6e72958621b2b1db73"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.509684 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.510688 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" event={"ID":"e8000091-319c-4799-8068-77dd150352f4","Type":"ContainerStarted","Data":"727632f13a748df93df9b9696bb39506acd9d781a7cefe0e2d4092b6b3aa2066"} Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.511128 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.011111244 +0000 UTC m=+149.997948896 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.514284 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" event={"ID":"ccf1336b-0e9f-42d7-adca-45e017360773","Type":"ContainerStarted","Data":"50eb08d02ab0bbdcd453171b6dd069e23e382d3d24e8820f6f068570863b60b2"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.524390 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dwtxw" event={"ID":"f45e1ce1-c477-4a87-a3ab-821e702ce490","Type":"ContainerStarted","Data":"38846f7bf02efeea11f12e0f52e0e4afe64047d4e48e358d3c2e19ec237a4c27"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.524449 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dwtxw" event={"ID":"f45e1ce1-c477-4a87-a3ab-821e702ce490","Type":"ContainerStarted","Data":"20c20b3794ab8382a8f5e91e021135207824bf96f8fec979799dd3fa56a15919"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.532782 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" 
event={"ID":"c3efd626-bb2f-4257-ac3a-fbf6d7a76780","Type":"ContainerStarted","Data":"c978893e814bee6755a0a3cf6670b93d8719f19af964e28ddca1a2eef1e0eb4c"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.534556 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" event={"ID":"db0351fd-226e-4fc2-b8f6-0132252c4e2e","Type":"ContainerStarted","Data":"5493cfceee76423c5bcfda3b6de4e7d8df8f3905a2ffee6ee79d9cccb4927f0d"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.536231 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rthjj" event={"ID":"565df8ea-1d38-4c6d-98a6-d63a58c8df03","Type":"ContainerStarted","Data":"635910a4a7e01919bb8f8e997609d000b22309805575a42c7b1ddacd751564e3"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.539312 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" event={"ID":"a83e8666-c5ff-45bf-a1ed-584a2554e6aa","Type":"ContainerStarted","Data":"d53a5ece96472fc189e2f1a91f7d17b57075bc9e953beaa42af7f203b1350b9b"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.546955 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hhl5h" event={"ID":"5addff88-f08e-4fea-bc20-fb09e5e7d504","Type":"ContainerStarted","Data":"5bcd27c20d825c7e4444f905108fea52b3fc7eb6a0c799ac3ac09fc622ffb6af"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.550549 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" event={"ID":"d5769473-e380-4d0e-bfe4-aab057473a62","Type":"ContainerStarted","Data":"f2288d5c0dfacb1c2d4d73df8cd94c2ad2a05a08bba13b59fa142b98958b6b44"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.551561 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5b7e309c0eba5f609c8aad90bdffa2a1f72a9dc0c8ed859cc0da49700cc8fea4"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.554348 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" event={"ID":"ec75ca64-71c8-4a23-8723-378e0450548b","Type":"ContainerStarted","Data":"356654ba3d4e610fa5420b113aaa1bf01c4a3ef3c1da9b55d6b53595526a47ec"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.556123 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" event={"ID":"9b815226-63ff-4e97-bdcd-4a48ee001b99","Type":"ContainerStarted","Data":"db8f57a23befd6f562ee9a538cc7c6f25517a90529df9c69b6ef0cc6da7316e6"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.557710 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" event={"ID":"72fe0643-91a8-459e-aec7-257e5b07ea41","Type":"ContainerStarted","Data":"e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162"} Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.558339 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.583238 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lr54b" podStartSLOduration=130.58322022 podStartE2EDuration="2m10.58322022s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:51.542295845 +0000 UTC m=+149.529133517" 
watchObservedRunningTime="2025-11-29 14:30:51.58322022 +0000 UTC m=+149.570057872" Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.610370 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.610502 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.110485312 +0000 UTC m=+150.097322964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.611944 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.612535 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.112525505 +0000 UTC m=+150.099363157 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: W1129 14:30:51.670287 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bd0fc3f_a365_48e4_bbe2_1509a739460a.slice/crio-d6d22ba5cdef4736eede3e5108544f3028595250dc564007510a8d4733e7e712 WatchSource:0}: Error finding container d6d22ba5cdef4736eede3e5108544f3028595250dc564007510a8d4733e7e712: Status 404 returned error can't find the container with id d6d22ba5cdef4736eede3e5108544f3028595250dc564007510a8d4733e7e712 Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.712682 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.714635 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.214620273 +0000 UTC m=+150.201457925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.787647 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.815160 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.815469 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.31545886 +0000 UTC m=+150.302296512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:51 crc kubenswrapper[4907]: I1129 14:30:51.915910 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:51 crc kubenswrapper[4907]: E1129 14:30:51.916331 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.416316549 +0000 UTC m=+150.403154201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.016718 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-5t44g" podStartSLOduration=131.016690167 podStartE2EDuration="2m11.016690167s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:52.001373568 +0000 UTC m=+149.988211210" watchObservedRunningTime="2025-11-29 14:30:52.016690167 +0000 UTC m=+150.003527819" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.019989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:52 crc kubenswrapper[4907]: E1129 14:30:52.020276 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.520262512 +0000 UTC m=+150.507100164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.139874 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:52 crc kubenswrapper[4907]: E1129 14:30:52.140173 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.64015545 +0000 UTC m=+150.626993102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.147197 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.220650 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.241869 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:52 crc kubenswrapper[4907]: E1129 14:30:52.242128 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.742117773 +0000 UTC m=+150.728955425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.266155 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-c572b" podStartSLOduration=131.266136594 podStartE2EDuration="2m11.266136594s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:52.229005482 +0000 UTC m=+150.215843124" watchObservedRunningTime="2025-11-29 14:30:52.266136594 +0000 UTC m=+150.252974236" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.313028 4907 patch_prober.go:28] interesting pod/router-default-5444994796-dwtxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 14:30:52 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Nov 29 14:30:52 crc kubenswrapper[4907]: [+]process-running ok Nov 29 14:30:52 crc kubenswrapper[4907]: healthz check failed Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.313092 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dwtxw" podUID="f45e1ce1-c477-4a87-a3ab-821e702ce490" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.342971 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:52 crc kubenswrapper[4907]: E1129 14:30:52.343432 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.843403428 +0000 UTC m=+150.830241080 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.407359 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" podStartSLOduration=132.407339373 podStartE2EDuration="2m12.407339373s" podCreationTimestamp="2025-11-29 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:52.366871157 +0000 UTC m=+150.353708809" watchObservedRunningTime="2025-11-29 14:30:52.407339373 +0000 UTC m=+150.394177015" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.410237 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pv2xm" 
podStartSLOduration=131.41022729 podStartE2EDuration="2m11.41022729s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:52.391733642 +0000 UTC m=+150.378571294" watchObservedRunningTime="2025-11-29 14:30:52.41022729 +0000 UTC m=+150.397064942" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.447881 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:52 crc kubenswrapper[4907]: E1129 14:30:52.450170 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:52.950156285 +0000 UTC m=+150.936993927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.514754 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-dwtxw" podStartSLOduration=131.514737726 podStartE2EDuration="2m11.514737726s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:52.501267371 +0000 UTC m=+150.488105023" watchObservedRunningTime="2025-11-29 14:30:52.514737726 +0000 UTC m=+150.501575378" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.516519 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.550950 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:52 crc kubenswrapper[4907]: E1129 14:30:52.551266 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-29 14:30:53.051250003 +0000 UTC m=+151.038087655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.606145 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djhlh" podStartSLOduration=131.606126311 podStartE2EDuration="2m11.606126311s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:52.565062171 +0000 UTC m=+150.551899823" watchObservedRunningTime="2025-11-29 14:30:52.606126311 +0000 UTC m=+150.592963963" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.606385 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" podStartSLOduration=131.606382422 podStartE2EDuration="2m11.606382422s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:52.604424763 +0000 UTC m=+150.591262415" watchObservedRunningTime="2025-11-29 14:30:52.606382422 +0000 UTC m=+150.593220074" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.633783 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" 
event={"ID":"d5769473-e380-4d0e-bfe4-aab057473a62","Type":"ContainerStarted","Data":"6096d7e1b7b94919504d18b309d38a3290288792af88e3db27a8ef10b064d853"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.634946 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.662615 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:52 crc kubenswrapper[4907]: E1129 14:30:52.663218 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:53.16320606 +0000 UTC m=+151.150043712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.672049 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" event={"ID":"cf8dba76-208a-4a46-8f25-50beb0d28ae6","Type":"ContainerStarted","Data":"456a1d3c582d442702ef0526218743f64c286ca3e2fcf3ebae10432f84ebeec6"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.672102 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" event={"ID":"cf8dba76-208a-4a46-8f25-50beb0d28ae6","Type":"ContainerStarted","Data":"8c8fbd6f43085e3d309d002cdc3af4765e0180e0c97e91b99e4cb8b24643bcf4"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.739394 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rthjj" event={"ID":"565df8ea-1d38-4c6d-98a6-d63a58c8df03","Type":"ContainerStarted","Data":"02ab57b7df8e580c35b3cb087f5f9e436f1b2d2c91ba1f222be989a07b5b7223"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.739470 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-rthjj" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.742138 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" 
event={"ID":"fa61d693-3aab-4537-9075-3f99fde5cb8d","Type":"ContainerStarted","Data":"ae2f168d5fa0cf956c3b575d17a1c05fe624ceb9d872075752382757673e0305"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.752170 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-rthjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.752221 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rthjj" podUID="565df8ea-1d38-4c6d-98a6-d63a58c8df03" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.42:8080/\": dial tcp 10.217.0.42:8080: connect: connection refused" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.754970 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" event={"ID":"9b815226-63ff-4e97-bdcd-4a48ee001b99","Type":"ContainerStarted","Data":"e2738a835c0616fd3b14e8bc5754128e3f0ee358ed33ed3796202ea4d00da2cc"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.769786 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:52 crc kubenswrapper[4907]: E1129 14:30:52.771547 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:53.27152852 +0000 UTC m=+151.258366172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.789902 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2j64h" event={"ID":"c630db4e-b02b-480e-bfe7-c93188434b14","Type":"ContainerStarted","Data":"1acc49316444c992b3c0a8720b0c794ae0dce9fc57b2a16a999d7d4db79bff52"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.792585 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" event={"ID":"a259660c-b57b-4a89-9f33-19d3bb3f5a93","Type":"ContainerStarted","Data":"811c3c8d26a7c0b94c501a69463410df1338b4cf90ed082e7b04d2c2a9fe7fd3"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.797678 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" event={"ID":"c1d7892f-1c0b-4106-b34d-b309f9c60807","Type":"ContainerStarted","Data":"0d46f66188223640e447835aa9399415cd45c4811fff8d4dc9c33cd1abcb83f5"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.801140 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" event={"ID":"fd9557ee-7350-479a-80d7-20ae8c6a229d","Type":"ContainerStarted","Data":"37d51535ea92564e03a41ffb264988a58440570c901930a9fa99b3a274cd0efa"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.809498 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jrw7z"] Nov 29 
14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.809910 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.813767 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" event={"ID":"e8000091-319c-4799-8068-77dd150352f4","Type":"ContainerStarted","Data":"1cd2637c20d8258128e7c13f99214db0a4a73ba9760e2e012d79eab407c039c2"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.841735 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dkwzk"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.848640 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" event={"ID":"db0351fd-226e-4fc2-b8f6-0132252c4e2e","Type":"ContainerStarted","Data":"bbf7af234d50d7a46d722a3fcb4f001f6e27411606539efc2e3c2bf4d02ceced"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.852231 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" event={"ID":"415a984e-e3ba-4e39-adb2-f79c8ed05f3f","Type":"ContainerStarted","Data":"cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.853401 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.857011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hhl5h" event={"ID":"5addff88-f08e-4fea-bc20-fb09e5e7d504","Type":"ContainerStarted","Data":"ae03f1ffcd76cc7cd369dc8d14545960b1f57aaf4f2959ece447ddae71967147"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 
14:30:52.857702 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.872723 4907 patch_prober.go:28] interesting pod/console-operator-58897d9998-hhl5h container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.872785 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hhl5h" podUID="5addff88-f08e-4fea-bc20-fb09e5e7d504" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.872876 4907 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-gpnkx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.872889 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" podUID="415a984e-e3ba-4e39-adb2-f79c8ed05f3f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.874219 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.877081 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.880360 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.886098 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.886139 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" event={"ID":"ad127416-8f9c-4d2d-a13a-8a0b69525847","Type":"ContainerStarted","Data":"f73f681ad0b73e137fa0eb425886e51ab326416205459029e8d6433d2d9daca4"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.900224 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:52 crc kubenswrapper[4907]: E1129 14:30:52.903677 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:53.403662472 +0000 UTC m=+151.390500124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.908718 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" event={"ID":"2bd0fc3f-a365-48e4-bbe2-1509a739460a","Type":"ContainerStarted","Data":"d6d22ba5cdef4736eede3e5108544f3028595250dc564007510a8d4733e7e712"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.908776 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wl27q"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.917211 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" event={"ID":"7a422846-a8dc-47ce-912d-6444fb22b575","Type":"ContainerStarted","Data":"e66ca2b83d7d3ffc1e216622f160a41e465fe17f53156858a4bc507281dabd69"} Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.931259 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jxwmf"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.933317 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.942513 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtr2r"] Nov 29 14:30:52 crc kubenswrapper[4907]: I1129 14:30:52.942567 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh"] Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.001794 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.002218 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:53.502155904 +0000 UTC m=+151.488993566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.056305 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-jcqhw" podStartSLOduration=132.056289833 podStartE2EDuration="2m12.056289833s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.05473993 +0000 UTC m=+151.041577582" watchObservedRunningTime="2025-11-29 14:30:53.056289833 +0000 UTC m=+151.043127485" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.079627 4907 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.080565 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.104368 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.106941 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:53.606926711 +0000 UTC m=+151.593764363 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.108887 4907 patch_prober.go:28] interesting pod/apiserver-76f77b778f-mtvpt container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]log ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]etcd ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/generic-apiserver-start-informers ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/max-in-flight-filter ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 29 14:30:53 crc kubenswrapper[4907]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/project.openshift.io-projectcache ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/openshift.io-startinformers ok Nov 29 14:30:53 crc kubenswrapper[4907]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 29 14:30:53 crc kubenswrapper[4907]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 29 14:30:53 crc kubenswrapper[4907]: livez check failed Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.108976 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" podUID="28aad01a-534b-4d04-aceb-ad163db9871c" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.119840 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hhl5h" podStartSLOduration=132.119822682 podStartE2EDuration="2m12.119822682s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.090073829 +0000 UTC m=+151.076911481" watchObservedRunningTime="2025-11-29 14:30:53.119822682 +0000 UTC m=+151.106660334" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.120359 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mjqv" podStartSLOduration=133.120354024 podStartE2EDuration="2m13.120354024s" podCreationTimestamp="2025-11-29 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.115857692 +0000 UTC m=+151.102695344" watchObservedRunningTime="2025-11-29 14:30:53.120354024 +0000 UTC m=+151.107191676" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.191584 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2j64h" podStartSLOduration=6.191568553 podStartE2EDuration="6.191568553s" podCreationTimestamp="2025-11-29 14:30:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.155067787 +0000 UTC m=+151.141905449" watchObservedRunningTime="2025-11-29 14:30:53.191568553 +0000 UTC m=+151.178406205" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.205352 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.206546 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:53.706530068 +0000 UTC m=+151.693367710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.234733 4907 patch_prober.go:28] interesting pod/router-default-5444994796-dwtxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 14:30:53 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Nov 29 14:30:53 crc kubenswrapper[4907]: [+]process-running ok Nov 29 14:30:53 crc kubenswrapper[4907]: healthz check failed Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.234865 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dwtxw" podUID="f45e1ce1-c477-4a87-a3ab-821e702ce490" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.303870 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pkrnj" podStartSLOduration=132.303849913 podStartE2EDuration="2m12.303849913s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.302857603 +0000 UTC m=+151.289695245" watchObservedRunningTime="2025-11-29 14:30:53.303849913 +0000 UTC m=+151.290687555" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.307944 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.308651 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:53.808637347 +0000 UTC m=+151.795474999 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.365928 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" podStartSLOduration=133.365913693 podStartE2EDuration="2m13.365913693s" podCreationTimestamp="2025-11-29 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.328647416 +0000 UTC m=+151.315485068" watchObservedRunningTime="2025-11-29 14:30:53.365913693 +0000 UTC m=+151.352751345" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.366233 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kd4xm" podStartSLOduration=132.366230106 podStartE2EDuration="2m12.366230106s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.363809988 +0000 UTC m=+151.350647640" watchObservedRunningTime="2025-11-29 14:30:53.366230106 +0000 UTC m=+151.353067758" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.388643 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-rthjj" podStartSLOduration=132.388623741 podStartE2EDuration="2m12.388623741s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.388462135 +0000 UTC m=+151.375299787" watchObservedRunningTime="2025-11-29 14:30:53.388623741 +0000 UTC m=+151.375461393" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.410076 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.410375 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:53.91035934 +0000 UTC m=+151.897196992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.427672 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" podStartSLOduration=132.42765673 podStartE2EDuration="2m12.42765673s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.425973991 +0000 UTC m=+151.412811653" watchObservedRunningTime="2025-11-29 14:30:53.42765673 +0000 UTC m=+151.414494382" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.510939 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.511252 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.011240489 +0000 UTC m=+151.998078141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.616273 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.616642 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.11661193 +0000 UTC m=+152.103449582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.619109 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.619557 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.119542419 +0000 UTC m=+152.106380071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.720257 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.721419 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.221394907 +0000 UTC m=+152.208232559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.821206 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.821761 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.321749235 +0000 UTC m=+152.308586887 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.923012 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:53 crc kubenswrapper[4907]: E1129 14:30:53.923351 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.423337143 +0000 UTC m=+152.410174795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.932347 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" event={"ID":"26174366-e2ac-42d4-b0aa-ce6dd5e50ade","Type":"ContainerStarted","Data":"7f10bc0343cfb115185e02798b2ec34f600e4eed058d2cc652b1ff7be79a54b6"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.932388 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" event={"ID":"26174366-e2ac-42d4-b0aa-ce6dd5e50ade","Type":"ContainerStarted","Data":"0d600166862f056e04581d9b2a1ef98e4701013e26d5601cb4e3ad908d87370d"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.937790 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" event={"ID":"0c5dabfe-62e3-4104-9939-59e4832c6484","Type":"ContainerStarted","Data":"7861223e15745cb2a9985265070b6bbe9d12b310a1aab6f30203ee40d13583d1"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.937998 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" event={"ID":"0c5dabfe-62e3-4104-9939-59e4832c6484","Type":"ContainerStarted","Data":"e4d85c843359ade5ba9d9b6016a398d168b380365227f74cfbf8612db3eb2a64"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.951107 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" event={"ID":"00ad88f1-cc82-4e00-8b0a-ddb548db43fa","Type":"ContainerStarted","Data":"9049fcc67bc62f325f0445edbdc4b0f59f299b17cb28b81052ba282578b6f35e"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.951303 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" event={"ID":"00ad88f1-cc82-4e00-8b0a-ddb548db43fa","Type":"ContainerStarted","Data":"bb56b7fd7f7034b6a7d59480276ad17ef7c742c255c8308552f5259b6fd56e71"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.951744 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.952608 4907 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vw9j7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.952716 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" podUID="00ad88f1-cc82-4e00-8b0a-ddb548db43fa" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.959241 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" event={"ID":"fdc08015-ab62-4a36-a1f6-e17514ac47dd","Type":"ContainerStarted","Data":"8339d024bf1e7aff2df4338aac161f1892d29025247222d80b1984f7e0471a31"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.960449 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-jrw7z" event={"ID":"ac45e4d4-e19e-49d8-a1da-1c27442e7da7","Type":"ContainerStarted","Data":"40050dc8bd75ba48243d44795b829838322777321b2710650f533368bd20e479"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.966729 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" event={"ID":"c3efd626-bb2f-4257-ac3a-fbf6d7a76780","Type":"ContainerStarted","Data":"5e25308424c833f390de5fdf220e086105153847bb03aa0ffe294e08ee3959c7"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.967289 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.970818 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" event={"ID":"c67f836d-7f9f-4827-a8d5-0f3934f96b14","Type":"ContainerStarted","Data":"c1929cdf4b635f4f2e3f65244b3393c5e3ce62d1319e7c2d513649d00af50b63"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.972811 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" event={"ID":"a259660c-b57b-4a89-9f33-19d3bb3f5a93","Type":"ContainerStarted","Data":"602193d41e220ce3274d996b139832df814b3285c233c0710393a7ec24970b82"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.978508 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2j64h" event={"ID":"c630db4e-b02b-480e-bfe7-c93188434b14","Type":"ContainerStarted","Data":"45c73e4f2759bcf085156edbc0dc5b1456835a068eac7164fb76688018a32b4b"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.981178 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" 
Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.988619 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" podStartSLOduration=132.988603082 podStartE2EDuration="2m12.988603082s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.453187932 +0000 UTC m=+151.440025584" watchObservedRunningTime="2025-11-29 14:30:53.988603082 +0000 UTC m=+151.975440734" Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.990929 4907 generic.go:334] "Generic (PLEG): container finished" podID="2bd0fc3f-a365-48e4-bbe2-1509a739460a" containerID="732e4ec43c6e4c2ea9aef003381b8ae05634bce481e48b69226bb6b6450640b1" exitCode=0 Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.991042 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" event={"ID":"2bd0fc3f-a365-48e4-bbe2-1509a739460a","Type":"ContainerStarted","Data":"cdf7a307b45fd7d374eea2049bc2d5078da8660dd0047c6e69305a951f94398c"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.991824 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" event={"ID":"2bd0fc3f-a365-48e4-bbe2-1509a739460a","Type":"ContainerDied","Data":"732e4ec43c6e4c2ea9aef003381b8ae05634bce481e48b69226bb6b6450640b1"} Nov 29 14:30:53 crc kubenswrapper[4907]: I1129 14:30:53.991914 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.000043 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" 
event={"ID":"c1d7892f-1c0b-4106-b34d-b309f9c60807","Type":"ContainerStarted","Data":"241e4f68f2c046d615ba2af5923eaceceb9561c093d99dd508f4f360816be69d"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.020875 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75hhz" podStartSLOduration=133.020857156 podStartE2EDuration="2m13.020857156s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:53.98781913 +0000 UTC m=+151.974656772" watchObservedRunningTime="2025-11-29 14:30:54.020857156 +0000 UTC m=+152.007694808" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.023834 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" event={"ID":"7a422846-a8dc-47ce-912d-6444fb22b575","Type":"ContainerStarted","Data":"4f8a3c41b04b360b1a4ed1730c88d2920880ce100f0cf4235338c9f56b71159c"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.024378 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.024827 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.524811056 +0000 UTC m=+152.511648698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.040512 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pm8bc" event={"ID":"fa61d693-3aab-4537-9075-3f99fde5cb8d","Type":"ContainerStarted","Data":"4a338306a76a7d7227f27924ebd7e4ca479cc2ac943c109b20180084df756211"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.049118 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" podStartSLOduration=54.049101508 podStartE2EDuration="54.049101508s" podCreationTimestamp="2025-11-29 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.022868627 +0000 UTC m=+152.009706279" watchObservedRunningTime="2025-11-29 14:30:54.049101508 +0000 UTC m=+152.035939160" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.049567 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-54x97" podStartSLOduration=133.049563977 podStartE2EDuration="2m13.049563977s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.047775254 +0000 UTC m=+152.034612906" watchObservedRunningTime="2025-11-29 14:30:54.049563977 +0000 UTC 
m=+152.036401629" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.090189 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" event={"ID":"87211b95-db9a-4429-be07-91b57e6355c3","Type":"ContainerStarted","Data":"227cb5f13c1b2a482fe1bf0e4bb9e9291184ea19bd07b1d0a866a13c0cf37c0f"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.125212 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" podStartSLOduration=133.125197245 podStartE2EDuration="2m13.125197245s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.105691176 +0000 UTC m=+152.092528828" watchObservedRunningTime="2025-11-29 14:30:54.125197245 +0000 UTC m=+152.112034897" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.138693 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.139181 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.63916502 +0000 UTC m=+152.626002672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.139659 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.140266 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.640249444 +0000 UTC m=+152.627087096 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.159240 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" event={"ID":"db0351fd-226e-4fc2-b8f6-0132252c4e2e","Type":"ContainerStarted","Data":"562e92c6f38be869e27c2365f361984823da55623af5377ad12af59c9f3fd718"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.160366 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh" podStartSLOduration=133.160352597 podStartE2EDuration="2m13.160352597s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.159157768 +0000 UTC m=+152.145995420" watchObservedRunningTime="2025-11-29 14:30:54.160352597 +0000 UTC m=+152.147190239" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.225321 4907 patch_prober.go:28] interesting pod/router-default-5444994796-dwtxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 14:30:54 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Nov 29 14:30:54 crc kubenswrapper[4907]: [+]process-running ok Nov 29 14:30:54 crc kubenswrapper[4907]: healthz check failed Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.225718 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" event={"ID":"ed407316-6a6a-47dc-99eb-6be44d35b22f","Type":"ContainerStarted","Data":"cfd798cfebae41086ac38cfc6b38d083ce9f3f3b616ec44e86accc7db25891c6"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.225860 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" event={"ID":"ed407316-6a6a-47dc-99eb-6be44d35b22f","Type":"ContainerStarted","Data":"f5cb161a755e9aee3b63921cc6de6310d6020dfdbac9cdf736a5b8b3327bb7ef"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.225800 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dwtxw" podUID="f45e1ce1-c477-4a87-a3ab-821e702ce490" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.233276 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9" event={"ID":"d36c3ddb-8f3c-477d-bf22-1e336519a83e","Type":"ContainerStarted","Data":"c107191642b82b59961487b719676665b59bb59c91670f1bd17c772d37767bcf"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.233325 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9" event={"ID":"d36c3ddb-8f3c-477d-bf22-1e336519a83e","Type":"ContainerStarted","Data":"570ca2ca2ab5ff0022d21e7a9e9a2a4ba4f9234e1ea5d1612a571cff2b5c68b7"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.242414 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:54 crc 
kubenswrapper[4907]: E1129 14:30:54.244359 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.744331022 +0000 UTC m=+152.731168674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.268139 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" event={"ID":"be668353-b5df-42f6-bc63-9e896be4f7e7","Type":"ContainerStarted","Data":"dddd3b2181ed578c58da0b314afa1e749a9cb1b185ec5537a02b7fefab37a66c"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.284928 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45ssg" podStartSLOduration=133.284913173 podStartE2EDuration="2m13.284913173s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.207707221 +0000 UTC m=+152.194544873" watchObservedRunningTime="2025-11-29 14:30:54.284913173 +0000 UTC m=+152.271750825" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.286130 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" 
event={"ID":"6c8db7d8-f286-4ca7-81ad-76198fcc5178","Type":"ContainerStarted","Data":"f67990ba1ee5a612c26d9c3970bd7ccf36fdd9bafef6f4574a14c5854966b872"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.286180 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" event={"ID":"6c8db7d8-f286-4ca7-81ad-76198fcc5178","Type":"ContainerStarted","Data":"d41b69570b429c1c0d879798652d076fb8fcd7036c054dca641d9ae82ec12450"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.287144 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.295062 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jxwmf" event={"ID":"595bcc18-9208-4034-b70d-be547bd61aa0","Type":"ContainerStarted","Data":"85b4cac2589bc93089f287170a1d9c079add486b44397ddac20b6c988fbf27d0"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.297275 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" event={"ID":"2c98c8e5-b9f1-43ca-93f6-cb74695dd076","Type":"ContainerStarted","Data":"cd5127b44244351f1b7c47a819489bcd00fe63db5cc51afe60565c3ca72163a8"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.298608 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.313917 4907 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qtr2r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.313973 4907 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" podUID="2c98c8e5-b9f1-43ca-93f6-cb74695dd076" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.334313 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.344203 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.344456 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8jxfs" podStartSLOduration=134.34442494 podStartE2EDuration="2m14.34442494s" podCreationTimestamp="2025-11-29 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.285347661 +0000 UTC m=+152.272185313" watchObservedRunningTime="2025-11-29 14:30:54.34442494 +0000 UTC m=+152.331262592" Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.344649 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.844633648 +0000 UTC m=+152.831471300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.344952 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" event={"ID":"429c8834-660f-4631-889b-f00c5f79f30a","Type":"ContainerStarted","Data":"ab52898ecef55462fba95f27d8334aab0cac12be714e8b4c9d6cf45b2ebc9f0e"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.344993 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" event={"ID":"429c8834-660f-4631-889b-f00c5f79f30a","Type":"ContainerStarted","Data":"26b3af8ff8282d86db48dd2bcb15bb6b26786dbeb84e33284d8e14b334fe007f"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.345636 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.373750 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" event={"ID":"d5769473-e380-4d0e-bfe4-aab057473a62","Type":"ContainerStarted","Data":"043cc32d1bd35a7aff85cf7a79ba26debf159a8e57a0c9789e83d39240428779"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.392459 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" 
event={"ID":"a83e8666-c5ff-45bf-a1ed-584a2554e6aa","Type":"ContainerStarted","Data":"1fbf00556712d04e31d564f4b9728a5fe2c4af9dca8838681e6b7ee85959a04d"} Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.413719 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-rthjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.413785 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rthjj" podUID="565df8ea-1d38-4c6d-98a6-d63a58c8df03" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.42:8080/\": dial tcp 10.217.0.42:8080: connect: connection refused" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.414748 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.415702 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.416015 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.433702 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-hhl5h" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.437755 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.446536 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.447647 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:54.947631173 +0000 UTC m=+152.934468825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.456490 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjc6g" podStartSLOduration=133.456473 podStartE2EDuration="2m13.456473s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.342882067 +0000 UTC m=+152.329719719" watchObservedRunningTime="2025-11-29 14:30:54.456473 +0000 UTC m=+152.443310642" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.457680 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v2jw" podStartSLOduration=133.457675439 podStartE2EDuration="2m13.457675439s" 
podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.449329352 +0000 UTC m=+152.436167004" watchObservedRunningTime="2025-11-29 14:30:54.457675439 +0000 UTC m=+152.444513091" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.531000 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-j46xp" podStartSLOduration=133.530986433 podStartE2EDuration="2m13.530986433s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.529977873 +0000 UTC m=+152.516815515" watchObservedRunningTime="2025-11-29 14:30:54.530986433 +0000 UTC m=+152.517824085" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.550751 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.558556 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.058535407 +0000 UTC m=+153.045373059 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.589955 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" podStartSLOduration=133.589936607 podStartE2EDuration="2m13.589936607s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.588412935 +0000 UTC m=+152.575250587" watchObservedRunningTime="2025-11-29 14:30:54.589936607 +0000 UTC m=+152.576774259" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.656491 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.656728 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.156713617 +0000 UTC m=+153.143551269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.766788 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.778344 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.278096796 +0000 UTC m=+153.264934478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.780596 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jcdhm" podStartSLOduration=133.780582826 podStartE2EDuration="2m13.780582826s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.779522623 +0000 UTC m=+152.766360275" watchObservedRunningTime="2025-11-29 14:30:54.780582826 +0000 UTC m=+152.767420478" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.781082 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f54qw" podStartSLOduration=133.781075766 podStartE2EDuration="2m13.781075766s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:54.681823113 +0000 UTC m=+152.668660765" watchObservedRunningTime="2025-11-29 14:30:54.781075766 +0000 UTC m=+152.767913408" Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.875290 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.875826 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.375804536 +0000 UTC m=+153.362642188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:54 crc kubenswrapper[4907]: I1129 14:30:54.978281 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:54 crc kubenswrapper[4907]: E1129 14:30:54.978696 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.478669566 +0000 UTC m=+153.465507218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.046302 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" podStartSLOduration=134.04628047 podStartE2EDuration="2m14.04628047s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:55.010355327 +0000 UTC m=+152.997192979" watchObservedRunningTime="2025-11-29 14:30:55.04628047 +0000 UTC m=+153.033118122" Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.080499 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.080976 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.580955132 +0000 UTC m=+153.567792784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.181889 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.182204 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.682191715 +0000 UTC m=+153.669029367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.218885 4907 patch_prober.go:28] interesting pod/router-default-5444994796-dwtxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 14:30:55 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Nov 29 14:30:55 crc kubenswrapper[4907]: [+]process-running ok Nov 29 14:30:55 crc kubenswrapper[4907]: healthz check failed Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.218941 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dwtxw" podUID="f45e1ce1-c477-4a87-a3ab-821e702ce490" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.283081 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.283259 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-29 14:30:55.783233721 +0000 UTC m=+153.770071373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.283393 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.283736 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.783728811 +0000 UTC m=+153.770566463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.384375 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.384600 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.884575289 +0000 UTC m=+153.871412941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.384851 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.385255 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.885247836 +0000 UTC m=+153.872085488 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.393614 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" event={"ID":"c67f836d-7f9f-4827-a8d5-0f3934f96b14","Type":"ContainerStarted","Data":"5800d3a323730ee769d59586baaa4b178ead0e584364e67226e7c72705c80586"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.393883 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" event={"ID":"c67f836d-7f9f-4827-a8d5-0f3934f96b14","Type":"ContainerStarted","Data":"cde045fcedde5a8cc54c767efa5443c028df7eceac834103a72c646e0001c9f3"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.395638 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jxwmf" event={"ID":"595bcc18-9208-4034-b70d-be547bd61aa0","Type":"ContainerStarted","Data":"53ec7d74ffa5d74fefcb30020098311102939d4c461384fa929131fd59946379"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.395659 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jxwmf" event={"ID":"595bcc18-9208-4034-b70d-be547bd61aa0","Type":"ContainerStarted","Data":"6bba60d2335fc13c9b7640d67db26d724fe7fe3c91b5c4fb24a4806db3c6035e"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.396170 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jxwmf" Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 
14:30:55.397244 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" event={"ID":"2c98c8e5-b9f1-43ca-93f6-cb74695dd076","Type":"ContainerStarted","Data":"2963f2ecc0957dfa3dd57ccce4fe0af6dd55e43b657e7da6e48ea1470582aee4"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.398406 4907 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qtr2r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.398450 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" podUID="2c98c8e5-b9f1-43ca-93f6-cb74695dd076" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.406886 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n" event={"ID":"429c8834-660f-4631-889b-f00c5f79f30a","Type":"ContainerStarted","Data":"f7698fbca8a34bc3a6d8b0d071787c4b84d5cf0915870ef9af055a987b5b88f6"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.419145 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jrw7z" event={"ID":"ac45e4d4-e19e-49d8-a1da-1c27442e7da7","Type":"ContainerStarted","Data":"859f8e9c63ba110de68e6305d42de8d9996d721606bb037b69b4b3b0a4db26aa"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.425586 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" 
event={"ID":"be668353-b5df-42f6-bc63-9e896be4f7e7","Type":"ContainerStarted","Data":"70ff2eebc06801f7ffe4482018a3267203c39bb23fef26d2a8ff27a4c04d17c9"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.427698 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t4lwh" podStartSLOduration=134.427679632 podStartE2EDuration="2m14.427679632s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:55.426959903 +0000 UTC m=+153.413797555" watchObservedRunningTime="2025-11-29 14:30:55.427679632 +0000 UTC m=+153.414517284" Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.429353 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgntg" podStartSLOduration=135.429346119 podStartE2EDuration="2m15.429346119s" podCreationTimestamp="2025-11-29 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:55.052983531 +0000 UTC m=+153.039821183" watchObservedRunningTime="2025-11-29 14:30:55.429346119 +0000 UTC m=+153.416183771" Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.431362 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" event={"ID":"fdc08015-ab62-4a36-a1f6-e17514ac47dd","Type":"ContainerStarted","Data":"c7118528c1500c8c422b9e92dc6924604454b86762b69c5fcb260c2857b814fd"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.431405 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" 
event={"ID":"fdc08015-ab62-4a36-a1f6-e17514ac47dd","Type":"ContainerStarted","Data":"47ea74263c2813727c480c9c513e8955884d2042d0bbf6b3e30de59a1618aa1e"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.434471 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9" event={"ID":"d36c3ddb-8f3c-477d-bf22-1e336519a83e","Type":"ContainerStarted","Data":"1ccebc1bd9f317e447770b3d75321501dad6cd4309101541aa398ae5bf2d3dfc"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.441087 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" event={"ID":"26174366-e2ac-42d4-b0aa-ce6dd5e50ade","Type":"ContainerStarted","Data":"95cb3389301482a06ef759f5de1077fac52e3d420557b11ab3b5838102152ace"} Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.460383 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4p48m" Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.466778 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jrw7z" podStartSLOduration=8.466756032 podStartE2EDuration="8.466756032s" podCreationTimestamp="2025-11-29 14:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:55.46622476 +0000 UTC m=+153.453062412" watchObservedRunningTime="2025-11-29 14:30:55.466756032 +0000 UTC m=+153.453593684" Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.470975 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vw9j7" Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.492880 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.494088 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jxwmf" podStartSLOduration=8.494071766 podStartE2EDuration="8.494071766s" podCreationTimestamp="2025-11-29 14:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:55.493727602 +0000 UTC m=+153.480565254" watchObservedRunningTime="2025-11-29 14:30:55.494071766 +0000 UTC m=+153.480909418" Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.494490 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:55.994457772 +0000 UTC m=+153.981295424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.596809 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.606211 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:56.10618783 +0000 UTC m=+154.093025482 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.676148 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-dkwzk" podStartSLOduration=134.676121828 podStartE2EDuration="2m14.676121828s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:55.592703235 +0000 UTC m=+153.579540887" watchObservedRunningTime="2025-11-29 14:30:55.676121828 +0000 UTC m=+153.662959480"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.676302 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.677056 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.685695 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.685794 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.698032 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.716790 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:56.21674646 +0000 UTC m=+154.203584112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.769126 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.793881 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tjtqd"]
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.795145 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.807959 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.809098 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.809228 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g"
Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.809636 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:56.309622266 +0000 UTC m=+154.296459918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.844139 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjtqd"]
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.876715 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.910330 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.910522 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.910606 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-catalog-content\") pod \"community-operators-tjtqd\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.910634 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99mgl\" (UniqueName: \"kubernetes.io/projected/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-kube-api-access-99mgl\") pod \"community-operators-tjtqd\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.910665 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-utilities\") pod \"community-operators-tjtqd\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.910698 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 29 14:30:55 crc kubenswrapper[4907]: E1129 14:30:55.910985 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:56.410961814 +0000 UTC m=+154.397799466 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.911102 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.941865 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2l24g" podStartSLOduration=134.941848313 podStartE2EDuration="2m14.941848313s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:55.93733158 +0000 UTC m=+153.924169232" watchObservedRunningTime="2025-11-29 14:30:55.941848313 +0000 UTC m=+153.928685965"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.949480 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g2mls"]
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.950400 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:55 crc kubenswrapper[4907]: I1129 14:30:55.988843 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.013067 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.013470 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-catalog-content\") pod \"community-operators-tjtqd\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.013553 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99mgl\" (UniqueName: \"kubernetes.io/projected/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-kube-api-access-99mgl\") pod \"community-operators-tjtqd\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.013633 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-utilities\") pod \"community-operators-tjtqd\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.014148 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-utilities\") pod \"community-operators-tjtqd\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:56 crc kubenswrapper[4907]: E1129 14:30:56.014502 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:56.51449158 +0000 UTC m=+154.501329222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.014790 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-catalog-content\") pod \"community-operators-tjtqd\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.041760 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hc9t9" podStartSLOduration=135.041737312 podStartE2EDuration="2m15.041737312s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:56.006056179 +0000 UTC m=+153.992893831" watchObservedRunningTime="2025-11-29 14:30:56.041737312 +0000 UTC m=+154.028574964"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.044718 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g2mls"]
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.081050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.108864 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99mgl\" (UniqueName: \"kubernetes.io/projected/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-kube-api-access-99mgl\") pod \"community-operators-tjtqd\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.114975 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.115198 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd4kk\" (UniqueName: \"kubernetes.io/projected/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-kube-api-access-dd4kk\") pod \"certified-operators-g2mls\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.115229 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-utilities\") pod \"certified-operators-g2mls\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.115253 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-catalog-content\") pod \"certified-operators-g2mls\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: E1129 14:30:56.115379 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:56.615363689 +0000 UTC m=+154.602201341 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.143925 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjtqd"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.152976 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6z8qw"]
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.153899 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.187274 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6z8qw"]
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.222982 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd4kk\" (UniqueName: \"kubernetes.io/projected/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-kube-api-access-dd4kk\") pod \"certified-operators-g2mls\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.223022 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-utilities\") pod \"certified-operators-g2mls\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.223047 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-catalog-content\") pod \"certified-operators-g2mls\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.223074 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g"
Nov 29 14:30:56 crc kubenswrapper[4907]: E1129 14:30:56.223352 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:56.723340115 +0000 UTC m=+154.710177767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.224033 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-utilities\") pod \"certified-operators-g2mls\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.224087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-catalog-content\") pod \"certified-operators-g2mls\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.231637 4907 patch_prober.go:28] interesting pod/router-default-5444994796-dwtxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 29 14:30:56 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld
Nov 29 14:30:56 crc kubenswrapper[4907]: [+]process-running ok
Nov 29 14:30:56 crc kubenswrapper[4907]: healthz check failed
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.231698 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dwtxw" podUID="f45e1ce1-c477-4a87-a3ab-821e702ce490" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.265815 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd4kk\" (UniqueName: \"kubernetes.io/projected/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-kube-api-access-dd4kk\") pod \"certified-operators-g2mls\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.278311 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g2mls"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.307829 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nr4x6"]
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.308986 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.318246 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.323924 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nr4x6"]
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.331937 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.332161 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ncrj\" (UniqueName: \"kubernetes.io/projected/4415eeb4-f833-4731-b210-7193f0b12556-kube-api-access-5ncrj\") pod \"community-operators-6z8qw\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.332220 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-utilities\") pod \"community-operators-6z8qw\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.332239 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-catalog-content\") pod \"community-operators-6z8qw\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: E1129 14:30:56.332388 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:56.832370644 +0000 UTC m=+154.819208296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.383146 4907 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.433698 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.433772 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-utilities\") pod \"certified-operators-nr4x6\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.433796 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-catalog-content\") pod \"certified-operators-nr4x6\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.433840 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ncrj\" (UniqueName: \"kubernetes.io/projected/4415eeb4-f833-4731-b210-7193f0b12556-kube-api-access-5ncrj\") pod \"community-operators-6z8qw\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.433866 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-utilities\") pod \"community-operators-6z8qw\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.433900 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-catalog-content\") pod \"community-operators-6z8qw\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.433922 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mh8\" (UniqueName: \"kubernetes.io/projected/f1a15057-e639-4f44-970d-2b439ed484e1-kube-api-access-v7mh8\") pod \"certified-operators-nr4x6\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: E1129 14:30:56.434257 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:56.934246093 +0000 UTC m=+154.921083735 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.435196 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-utilities\") pod \"community-operators-6z8qw\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.435475 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-catalog-content\") pod \"community-operators-6z8qw\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.460360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ncrj\" (UniqueName: \"kubernetes.io/projected/4415eeb4-f833-4731-b210-7193f0b12556-kube-api-access-5ncrj\") pod \"community-operators-6z8qw\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.535850 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" event={"ID":"be668353-b5df-42f6-bc63-9e896be4f7e7","Type":"ContainerStarted","Data":"fe2aa3026c08918a7dfe3e9e5a117bcd0d3d6d92e5d16af2d34ed9ec5d7130f1"}
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.535900 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" event={"ID":"be668353-b5df-42f6-bc63-9e896be4f7e7","Type":"ContainerStarted","Data":"2b9858dd7c948469687862a6b36e0111a59829481401ba0bfc48be1b54d9a0c4"}
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.535996 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mvnwh"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.543328 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.543893 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mh8\" (UniqueName: \"kubernetes.io/projected/f1a15057-e639-4f44-970d-2b439ed484e1-kube-api-access-v7mh8\") pod \"certified-operators-nr4x6\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.544096 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-utilities\") pod \"certified-operators-nr4x6\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.544148 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-catalog-content\") pod \"certified-operators-nr4x6\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.544791 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-catalog-content\") pod \"certified-operators-nr4x6\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.552233 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-utilities\") pod \"certified-operators-nr4x6\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.552519 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6z8qw"
Nov 29 14:30:56 crc kubenswrapper[4907]: E1129 14:30:56.552693 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 14:30:57.052672061 +0000 UTC m=+155.039509713 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.553130 4907 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qtr2r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.553208 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" podUID="2c98c8e5-b9f1-43ca-93f6-cb74695dd076" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.622110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mh8\" (UniqueName: \"kubernetes.io/projected/f1a15057-e639-4f44-970d-2b439ed484e1-kube-api-access-v7mh8\") pod \"certified-operators-nr4x6\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " pod="openshift-marketplace/certified-operators-nr4x6"
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.641205 4907 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-29T14:30:56.383191739Z","Handler":null,"Name":""}
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.646331 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g"
Nov 29 14:30:56 crc kubenswrapper[4907]: E1129 14:30:56.646704 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 14:30:57.146692392 +0000 UTC m=+155.133530044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-56l8g" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.652508 4907 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.652538 4907 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.654649 4907 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-nr4x6" Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.748180 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.806079 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.852065 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.859662 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.859700 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:56 crc kubenswrapper[4907]: I1129 14:30:56.963084 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-56l8g\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.022648 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjtqd"] Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.081015 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.097325 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nr4x6"] Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.153981 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.231629 4907 patch_prober.go:28] interesting pod/router-default-5444994796-dwtxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 14:30:57 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Nov 29 14:30:57 crc kubenswrapper[4907]: [+]process-running ok Nov 29 14:30:57 crc kubenswrapper[4907]: healthz check failed Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.231991 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dwtxw" podUID="f45e1ce1-c477-4a87-a3ab-821e702ce490" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.242707 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g2mls"] Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.252036 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6z8qw"] Nov 29 14:30:57 crc kubenswrapper[4907]: W1129 14:30:57.266233 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4415eeb4_f833_4731_b210_7193f0b12556.slice/crio-404e15e1bd8fa0fc022b6e79becf3b3c745d24c4ca8ae788ca953144e8eb01d1 WatchSource:0}: Error finding container 
404e15e1bd8fa0fc022b6e79becf3b3c745d24c4ca8ae788ca953144e8eb01d1: Status 404 returned error can't find the container with id 404e15e1bd8fa0fc022b6e79becf3b3c745d24c4ca8ae788ca953144e8eb01d1 Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.461105 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56l8g"] Nov 29 14:30:57 crc kubenswrapper[4907]: W1129 14:30:57.521463 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2ae2318_a4a0_4a3f_8744_a0e42e5fa50c.slice/crio-ec91088d4c9418375c41e4709bb5f99d3de146cfbb57699dc859d0459e2ceae8 WatchSource:0}: Error finding container ec91088d4c9418375c41e4709bb5f99d3de146cfbb57699dc859d0459e2ceae8: Status 404 returned error can't find the container with id ec91088d4c9418375c41e4709bb5f99d3de146cfbb57699dc859d0459e2ceae8 Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.543530 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2mls" event={"ID":"10f3989d-c7bb-4c4a-91e1-3b0afaedac98","Type":"ContainerStarted","Data":"aa73259d33c33175c9cb1968385086fa74e11d3312b48830b6771da48e02aa5e"} Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.545277 4907 generic.go:334] "Generic (PLEG): container finished" podID="f1a15057-e639-4f44-970d-2b439ed484e1" containerID="f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86" exitCode=0 Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.545384 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr4x6" event={"ID":"f1a15057-e639-4f44-970d-2b439ed484e1","Type":"ContainerDied","Data":"f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86"} Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.545449 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr4x6" 
event={"ID":"f1a15057-e639-4f44-970d-2b439ed484e1","Type":"ContainerStarted","Data":"9df01560ebb49a3824a4227871d3427e4913dfe5fdb3ccafd045357105014564"} Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.547266 4907 generic.go:334] "Generic (PLEG): container finished" podID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerID="268edbeeb474a99a59f4787a16f55b2880316e233d6239150520554e06b79377" exitCode=0 Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.547322 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtqd" event={"ID":"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74","Type":"ContainerDied","Data":"268edbeeb474a99a59f4787a16f55b2880316e233d6239150520554e06b79377"} Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.547339 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtqd" event={"ID":"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74","Type":"ContainerStarted","Data":"299f55338ea93b3549a5faa5b765abffd36116e1f1b1774695cb40b0db6811a8"} Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.548796 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.557716 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" event={"ID":"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c","Type":"ContainerStarted","Data":"ec91088d4c9418375c41e4709bb5f99d3de146cfbb57699dc859d0459e2ceae8"} Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.577634 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z8qw" event={"ID":"4415eeb4-f833-4731-b210-7193f0b12556","Type":"ContainerStarted","Data":"404e15e1bd8fa0fc022b6e79becf3b3c745d24c4ca8ae788ca953144e8eb01d1"} Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.580240 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b6710e5-a0a7-43d2-ba6b-62c3187099e7","Type":"ContainerStarted","Data":"96736693a35dd06b0cbf9d2408105a90e9743c1ce421eb666465f8bf0b352658"} Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.585808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" event={"ID":"be668353-b5df-42f6-bc63-9e896be4f7e7","Type":"ContainerStarted","Data":"9e67247bdb3f9ca16050ef7fbdc18b9b99ef0be11a19ae65c08eca88fabbfdcb"} Nov 29 14:30:57 crc kubenswrapper[4907]: I1129 14:30:57.606003 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" podStartSLOduration=10.605980022 podStartE2EDuration="10.605980022s" podCreationTimestamp="2025-11-29 14:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:57.6039539 +0000 UTC m=+155.590791572" watchObservedRunningTime="2025-11-29 14:30:57.605980022 +0000 UTC m=+155.592817674" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.089940 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.096525 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mtvpt" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.097115 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kkhhq"] Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.098110 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.099854 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.113167 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkhhq"] Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.222236 4907 patch_prober.go:28] interesting pod/router-default-5444994796-dwtxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 14:30:58 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Nov 29 14:30:58 crc kubenswrapper[4907]: [+]process-running ok Nov 29 14:30:58 crc kubenswrapper[4907]: healthz check failed Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.222317 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dwtxw" podUID="f45e1ce1-c477-4a87-a3ab-821e702ce490" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.273817 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-utilities\") pod \"redhat-marketplace-kkhhq\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.273872 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-catalog-content\") pod \"redhat-marketplace-kkhhq\" (UID: 
\"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.273917 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwbrj\" (UniqueName: \"kubernetes.io/projected/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-kube-api-access-pwbrj\") pod \"redhat-marketplace-kkhhq\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.375378 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-utilities\") pod \"redhat-marketplace-kkhhq\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.375429 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-catalog-content\") pod \"redhat-marketplace-kkhhq\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.375488 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbrj\" (UniqueName: \"kubernetes.io/projected/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-kube-api-access-pwbrj\") pod \"redhat-marketplace-kkhhq\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.375988 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-catalog-content\") pod \"redhat-marketplace-kkhhq\" (UID: 
\"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.376187 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-utilities\") pod \"redhat-marketplace-kkhhq\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.404608 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwbrj\" (UniqueName: \"kubernetes.io/projected/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-kube-api-access-pwbrj\") pod \"redhat-marketplace-kkhhq\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.435199 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.488413 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.491332 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.491383 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.498603 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jg22"] Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.499805 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.507078 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jg22"] Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.632585 4907 generic.go:334] "Generic (PLEG): container finished" podID="a259660c-b57b-4a89-9f33-19d3bb3f5a93" containerID="602193d41e220ce3274d996b139832df814b3285c233c0710393a7ec24970b82" exitCode=0 Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.632671 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" event={"ID":"a259660c-b57b-4a89-9f33-19d3bb3f5a93","Type":"ContainerDied","Data":"602193d41e220ce3274d996b139832df814b3285c233c0710393a7ec24970b82"} Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.635267 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.639101 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z8qw" event={"ID":"4415eeb4-f833-4731-b210-7193f0b12556","Type":"ContainerStarted","Data":"ca98c11a5d5284d37efda32dd383dbfdc0970c51531f335e93e8661612dfb555"} Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.643078 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"0b6710e5-a0a7-43d2-ba6b-62c3187099e7","Type":"ContainerStarted","Data":"cb955fbd3552c30f9bfc20d92f9a3ec0f1b5ccce4603f01cf208b6ecfcdc8f38"} Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.686038 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-catalog-content\") pod \"redhat-marketplace-5jg22\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.686339 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-utilities\") pod \"redhat-marketplace-5jg22\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.686369 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlpbt\" (UniqueName: \"kubernetes.io/projected/4e855aa2-07b3-4277-b106-6ba62b287992-kube-api-access-wlpbt\") pod \"redhat-marketplace-5jg22\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.698744 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.698731318 podStartE2EDuration="3.698731318s" podCreationTimestamp="2025-11-29 14:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:58.696645104 +0000 UTC m=+156.683482756" watchObservedRunningTime="2025-11-29 14:30:58.698731318 +0000 UTC m=+156.685568970" Nov 29 14:30:58 crc 
kubenswrapper[4907]: I1129 14:30:58.732108 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" podStartSLOduration=137.732088847 podStartE2EDuration="2m17.732088847s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:30:58.730159299 +0000 UTC m=+156.716997001" watchObservedRunningTime="2025-11-29 14:30:58.732088847 +0000 UTC m=+156.718926499" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.749190 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkhhq"] Nov 29 14:30:58 crc kubenswrapper[4907]: W1129 14:30:58.771536 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2b415cf_bdf4_4f9b_8ce8_a75b1a026337.slice/crio-3dfc4ce91b8faa153a6bbc939a698a4b9ca7f89f7dfbe817eeb70ff0a614f86a WatchSource:0}: Error finding container 3dfc4ce91b8faa153a6bbc939a698a4b9ca7f89f7dfbe817eeb70ff0a614f86a: Status 404 returned error can't find the container with id 3dfc4ce91b8faa153a6bbc939a698a4b9ca7f89f7dfbe817eeb70ff0a614f86a Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.787893 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-catalog-content\") pod \"redhat-marketplace-5jg22\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.787932 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-utilities\") pod \"redhat-marketplace-5jg22\" (UID: 
\"4e855aa2-07b3-4277-b106-6ba62b287992\") " pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.787990 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlpbt\" (UniqueName: \"kubernetes.io/projected/4e855aa2-07b3-4277-b106-6ba62b287992-kube-api-access-wlpbt\") pod \"redhat-marketplace-5jg22\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.789381 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-utilities\") pod \"redhat-marketplace-5jg22\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.789968 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-catalog-content\") pod \"redhat-marketplace-5jg22\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.817128 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlpbt\" (UniqueName: \"kubernetes.io/projected/4e855aa2-07b3-4277-b106-6ba62b287992-kube-api-access-wlpbt\") pod \"redhat-marketplace-5jg22\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.829686 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.906676 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-487xn"] Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.907859 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.910254 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 29 14:30:58 crc kubenswrapper[4907]: I1129 14:30:58.929218 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-487xn"] Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.128047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns79s\" (UniqueName: \"kubernetes.io/projected/c6b02860-46c7-4498-a162-8e2833deb120-kube-api-access-ns79s\") pod \"redhat-operators-487xn\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.128577 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-catalog-content\") pod \"redhat-operators-487xn\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.128659 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-utilities\") pod \"redhat-operators-487xn\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " 
pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.133520 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cflt7"] Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.137358 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.171532 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cflt7"] Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.207263 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jg22"] Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.222286 4907 patch_prober.go:28] interesting pod/router-default-5444994796-dwtxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 14:30:59 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Nov 29 14:30:59 crc kubenswrapper[4907]: [+]process-running ok Nov 29 14:30:59 crc kubenswrapper[4907]: healthz check failed Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.222341 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dwtxw" podUID="f45e1ce1-c477-4a87-a3ab-821e702ce490" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.231276 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mqcr\" (UniqueName: \"kubernetes.io/projected/03ea4590-5449-4b54-97f5-e09dbc538dfe-kube-api-access-5mqcr\") pod \"redhat-operators-cflt7\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " 
pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.231493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns79s\" (UniqueName: \"kubernetes.io/projected/c6b02860-46c7-4498-a162-8e2833deb120-kube-api-access-ns79s\") pod \"redhat-operators-487xn\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.231523 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-utilities\") pod \"redhat-operators-cflt7\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.231822 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-catalog-content\") pod \"redhat-operators-487xn\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.231982 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-catalog-content\") pod \"redhat-operators-cflt7\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.232022 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-utilities\") pod \"redhat-operators-487xn\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " 
pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.233195 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-utilities\") pod \"redhat-operators-487xn\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.233228 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-catalog-content\") pod \"redhat-operators-487xn\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.281778 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns79s\" (UniqueName: \"kubernetes.io/projected/c6b02860-46c7-4498-a162-8e2833deb120-kube-api-access-ns79s\") pod \"redhat-operators-487xn\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: W1129 14:30:59.285657 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e855aa2_07b3_4277_b106_6ba62b287992.slice/crio-ac090139c909f70422d905693cc7f029f7b6e00d1226ce80fc68d5c3230f7e1c WatchSource:0}: Error finding container ac090139c909f70422d905693cc7f029f7b6e00d1226ce80fc68d5c3230f7e1c: Status 404 returned error can't find the container with id ac090139c909f70422d905693cc7f029f7b6e00d1226ce80fc68d5c3230f7e1c Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.337700 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-catalog-content\") 
pod \"redhat-operators-cflt7\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.337750 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mqcr\" (UniqueName: \"kubernetes.io/projected/03ea4590-5449-4b54-97f5-e09dbc538dfe-kube-api-access-5mqcr\") pod \"redhat-operators-cflt7\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.337782 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-utilities\") pod \"redhat-operators-cflt7\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.338161 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-utilities\") pod \"redhat-operators-cflt7\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.338551 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-catalog-content\") pod \"redhat-operators-cflt7\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.355351 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mqcr\" (UniqueName: \"kubernetes.io/projected/03ea4590-5449-4b54-97f5-e09dbc538dfe-kube-api-access-5mqcr\") pod \"redhat-operators-cflt7\" (UID: 
\"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.430323 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.430393 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.435372 4907 patch_prober.go:28] interesting pod/console-f9d7485db-5t44g container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.435420 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5t44g" podUID="0c70da9a-ff96-432f-81ad-382c70754e70" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.488311 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.526990 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.690932 4907 generic.go:334] "Generic (PLEG): container finished" podID="4415eeb4-f833-4731-b210-7193f0b12556" containerID="ca98c11a5d5284d37efda32dd383dbfdc0970c51531f335e93e8661612dfb555" exitCode=0 Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.691288 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z8qw" event={"ID":"4415eeb4-f833-4731-b210-7193f0b12556","Type":"ContainerDied","Data":"ca98c11a5d5284d37efda32dd383dbfdc0970c51531f335e93e8661612dfb555"} Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.701231 4907 generic.go:334] "Generic (PLEG): container finished" podID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" containerID="74dd0fce1fd2e279d775da1cd97cc604577ed8fa5cde0cfbd7c92d11809380c2" exitCode=0 Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.701344 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkhhq" event={"ID":"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337","Type":"ContainerDied","Data":"74dd0fce1fd2e279d775da1cd97cc604577ed8fa5cde0cfbd7c92d11809380c2"} Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.701519 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkhhq" event={"ID":"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337","Type":"ContainerStarted","Data":"3dfc4ce91b8faa153a6bbc939a698a4b9ca7f89f7dfbe817eeb70ff0a614f86a"} Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.710404 4907 generic.go:334] "Generic (PLEG): container finished" podID="0b6710e5-a0a7-43d2-ba6b-62c3187099e7" containerID="cb955fbd3552c30f9bfc20d92f9a3ec0f1b5ccce4603f01cf208b6ecfcdc8f38" exitCode=0 Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.711259 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"0b6710e5-a0a7-43d2-ba6b-62c3187099e7","Type":"ContainerDied","Data":"cb955fbd3552c30f9bfc20d92f9a3ec0f1b5ccce4603f01cf208b6ecfcdc8f38"} Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.716226 4907 generic.go:334] "Generic (PLEG): container finished" podID="4e855aa2-07b3-4277-b106-6ba62b287992" containerID="99eb21ebcdbcf4ff05720457a360eb596e4f8845510302040d85ed447d735a7c" exitCode=0 Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.716297 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jg22" event={"ID":"4e855aa2-07b3-4277-b106-6ba62b287992","Type":"ContainerDied","Data":"99eb21ebcdbcf4ff05720457a360eb596e4f8845510302040d85ed447d735a7c"} Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.716331 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jg22" event={"ID":"4e855aa2-07b3-4277-b106-6ba62b287992","Type":"ContainerStarted","Data":"ac090139c909f70422d905693cc7f029f7b6e00d1226ce80fc68d5c3230f7e1c"} Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.718942 4907 generic.go:334] "Generic (PLEG): container finished" podID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerID="5884aa8a804c39334fa05a337ac2f88d91216c92d698904db83e99644bb6a880" exitCode=0 Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.719001 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2mls" event={"ID":"10f3989d-c7bb-4c4a-91e1-3b0afaedac98","Type":"ContainerDied","Data":"5884aa8a804c39334fa05a337ac2f88d91216c92d698904db83e99644bb6a880"} Nov 29 14:30:59 crc kubenswrapper[4907]: I1129 14:30:59.739317 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" event={"ID":"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c","Type":"ContainerStarted","Data":"22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297"} Nov 29 14:31:00 crc kubenswrapper[4907]: 
I1129 14:31:00.011050 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-487xn"] Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.060133 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-rthjj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.42:8080/\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.060189 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-rthjj" podUID="565df8ea-1d38-4c6d-98a6-d63a58c8df03" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.42:8080/\": dial tcp 10.217.0.42:8080: connect: connection refused" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.060221 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-rthjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.060280 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rthjj" podUID="565df8ea-1d38-4c6d-98a6-d63a58c8df03" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.42:8080/\": dial tcp 10.217.0.42:8080: connect: connection refused" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.156836 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cflt7"] Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.216547 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.222069 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.224535 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.372910 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a259660c-b57b-4a89-9f33-19d3bb3f5a93-config-volume\") pod \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.373039 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql9fm\" (UniqueName: \"kubernetes.io/projected/a259660c-b57b-4a89-9f33-19d3bb3f5a93-kube-api-access-ql9fm\") pod \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.373182 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a259660c-b57b-4a89-9f33-19d3bb3f5a93-secret-volume\") pod \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\" (UID: \"a259660c-b57b-4a89-9f33-19d3bb3f5a93\") " Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.375547 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a259660c-b57b-4a89-9f33-19d3bb3f5a93-config-volume" (OuterVolumeSpecName: "config-volume") pod "a259660c-b57b-4a89-9f33-19d3bb3f5a93" (UID: "a259660c-b57b-4a89-9f33-19d3bb3f5a93"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.382455 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a259660c-b57b-4a89-9f33-19d3bb3f5a93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a259660c-b57b-4a89-9f33-19d3bb3f5a93" (UID: "a259660c-b57b-4a89-9f33-19d3bb3f5a93"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.382878 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a259660c-b57b-4a89-9f33-19d3bb3f5a93-kube-api-access-ql9fm" (OuterVolumeSpecName: "kube-api-access-ql9fm") pod "a259660c-b57b-4a89-9f33-19d3bb3f5a93" (UID: "a259660c-b57b-4a89-9f33-19d3bb3f5a93"). InnerVolumeSpecName "kube-api-access-ql9fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.476560 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a259660c-b57b-4a89-9f33-19d3bb3f5a93-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.476607 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql9fm\" (UniqueName: \"kubernetes.io/projected/a259660c-b57b-4a89-9f33-19d3bb3f5a93-kube-api-access-ql9fm\") on node \"crc\" DevicePath \"\"" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.476619 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a259660c-b57b-4a89-9f33-19d3bb3f5a93-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.540393 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:31:00 crc 
kubenswrapper[4907]: I1129 14:31:00.758260 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" event={"ID":"a259660c-b57b-4a89-9f33-19d3bb3f5a93","Type":"ContainerDied","Data":"811c3c8d26a7c0b94c501a69463410df1338b4cf90ed082e7b04d2c2a9fe7fd3"} Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.758302 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="811c3c8d26a7c0b94c501a69463410df1338b4cf90ed082e7b04d2c2a9fe7fd3" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.758370 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh" Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.761778 4907 generic.go:334] "Generic (PLEG): container finished" podID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerID="9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33" exitCode=0 Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.761968 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflt7" event={"ID":"03ea4590-5449-4b54-97f5-e09dbc538dfe","Type":"ContainerDied","Data":"9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33"} Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.761995 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflt7" event={"ID":"03ea4590-5449-4b54-97f5-e09dbc538dfe","Type":"ContainerStarted","Data":"58ae8e42b2b417d2be5aedd977b9a86439654d7795005d54c305dd45eb54f8ea"} Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.772714 4907 generic.go:334] "Generic (PLEG): container finished" podID="c6b02860-46c7-4498-a162-8e2833deb120" containerID="6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53" exitCode=0 Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.772977 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487xn" event={"ID":"c6b02860-46c7-4498-a162-8e2833deb120","Type":"ContainerDied","Data":"6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53"} Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.773031 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487xn" event={"ID":"c6b02860-46c7-4498-a162-8e2833deb120","Type":"ContainerStarted","Data":"59f50a6032f0d79cf5621464cff6d6d30e6ea33f419263c5e96365a9e1bb934d"} Nov 29 14:31:00 crc kubenswrapper[4907]: I1129 14:31:00.776717 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-dwtxw" Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.181894 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.297247 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kubelet-dir\") pod \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\" (UID: \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\") " Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.297418 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0b6710e5-a0a7-43d2-ba6b-62c3187099e7" (UID: "0b6710e5-a0a7-43d2-ba6b-62c3187099e7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.297526 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kube-api-access\") pod \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\" (UID: \"0b6710e5-a0a7-43d2-ba6b-62c3187099e7\") " Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.298073 4907 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.308568 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b6710e5-a0a7-43d2-ba6b-62c3187099e7" (UID: "0b6710e5-a0a7-43d2-ba6b-62c3187099e7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.401942 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b6710e5-a0a7-43d2-ba6b-62c3187099e7-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.795891 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b6710e5-a0a7-43d2-ba6b-62c3187099e7","Type":"ContainerDied","Data":"96736693a35dd06b0cbf9d2408105a90e9743c1ce421eb666465f8bf0b352658"} Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.795949 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96736693a35dd06b0cbf9d2408105a90e9743c1ce421eb666465f8bf0b352658" Nov 29 14:31:01 crc kubenswrapper[4907]: I1129 14:31:01.795947 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.219358 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 29 14:31:03 crc kubenswrapper[4907]: E1129 14:31:03.219824 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a259660c-b57b-4a89-9f33-19d3bb3f5a93" containerName="collect-profiles" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.219835 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a259660c-b57b-4a89-9f33-19d3bb3f5a93" containerName="collect-profiles" Nov 29 14:31:03 crc kubenswrapper[4907]: E1129 14:31:03.219851 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6710e5-a0a7-43d2-ba6b-62c3187099e7" containerName="pruner" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.219859 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0b6710e5-a0a7-43d2-ba6b-62c3187099e7" containerName="pruner" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.219947 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a259660c-b57b-4a89-9f33-19d3bb3f5a93" containerName="collect-profiles" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.219960 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6710e5-a0a7-43d2-ba6b-62c3187099e7" containerName="pruner" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.220286 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.221904 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.222805 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.228308 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.357821 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.357908 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.459588 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.459678 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.460432 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.478281 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.541974 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.764502 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.767933 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f50e55a-d427-4cde-a639-d6c7597e937a-metrics-certs\") pod \"network-metrics-daemon-25ct5\" (UID: \"9f50e55a-d427-4cde-a639-d6c7597e937a\") " pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:31:03 crc kubenswrapper[4907]: I1129 14:31:03.827606 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25ct5" Nov 29 14:31:04 crc kubenswrapper[4907]: I1129 14:31:04.095951 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 29 14:31:04 crc kubenswrapper[4907]: W1129 14:31:04.179559 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb28f2b91_7867_4490_9b8a_de6a5d8789e4.slice/crio-678c039b21499cd8808034188c4999f7644413463f8bb1d71189c068baeb8255 WatchSource:0}: Error finding container 678c039b21499cd8808034188c4999f7644413463f8bb1d71189c068baeb8255: Status 404 returned error can't find the container with id 678c039b21499cd8808034188c4999f7644413463f8bb1d71189c068baeb8255 Nov 29 14:31:04 crc kubenswrapper[4907]: I1129 14:31:04.398141 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-25ct5"] Nov 29 14:31:04 crc kubenswrapper[4907]: W1129 14:31:04.427031 4907 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f50e55a_d427_4cde_a639_d6c7597e937a.slice/crio-f19f01c57d0a0b51b4e3c630b89fbf7701538dbe0154c7c54d70a6737cc55207 WatchSource:0}: Error finding container f19f01c57d0a0b51b4e3c630b89fbf7701538dbe0154c7c54d70a6737cc55207: Status 404 returned error can't find the container with id f19f01c57d0a0b51b4e3c630b89fbf7701538dbe0154c7c54d70a6737cc55207
Nov 29 14:31:04 crc kubenswrapper[4907]: I1129 14:31:04.860647 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b28f2b91-7867-4490-9b8a-de6a5d8789e4","Type":"ContainerStarted","Data":"678c039b21499cd8808034188c4999f7644413463f8bb1d71189c068baeb8255"}
Nov 29 14:31:04 crc kubenswrapper[4907]: I1129 14:31:04.871158 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-25ct5" event={"ID":"9f50e55a-d427-4cde-a639-d6c7597e937a","Type":"ContainerStarted","Data":"f19f01c57d0a0b51b4e3c630b89fbf7701538dbe0154c7c54d70a6737cc55207"}
Nov 29 14:31:05 crc kubenswrapper[4907]: I1129 14:31:05.605900 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jxwmf"
Nov 29 14:31:05 crc kubenswrapper[4907]: I1129 14:31:05.656010 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-wl27q" podUID="be668353-b5df-42f6-bc63-9e896be4f7e7" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 29 14:31:07 crc kubenswrapper[4907]: I1129 14:31:07.899544 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b28f2b91-7867-4490-9b8a-de6a5d8789e4","Type":"ContainerStarted","Data":"da239c0d6d2fbfea4d1e430b30c662e6b0b1f54fb0e9f377902fb09c0ba439c1"}
Nov 29 14:31:07 crc kubenswrapper[4907]: I1129 14:31:07.900918 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-25ct5" event={"ID":"9f50e55a-d427-4cde-a639-d6c7597e937a","Type":"ContainerStarted","Data":"b4d9778d525438e90cf817e27d0ba960986732266189ca3b95ebd302e79eb3c1"}
Nov 29 14:31:07 crc kubenswrapper[4907]: I1129 14:31:07.922168 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.922150982 podStartE2EDuration="4.922150982s" podCreationTimestamp="2025-11-29 14:31:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:31:07.92063436 +0000 UTC m=+165.907472012" watchObservedRunningTime="2025-11-29 14:31:07.922150982 +0000 UTC m=+165.908988634"
Nov 29 14:31:08 crc kubenswrapper[4907]: I1129 14:31:08.908720 4907 generic.go:334] "Generic (PLEG): container finished" podID="b28f2b91-7867-4490-9b8a-de6a5d8789e4" containerID="da239c0d6d2fbfea4d1e430b30c662e6b0b1f54fb0e9f377902fb09c0ba439c1" exitCode=0
Nov 29 14:31:08 crc kubenswrapper[4907]: I1129 14:31:08.908819 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b28f2b91-7867-4490-9b8a-de6a5d8789e4","Type":"ContainerDied","Data":"da239c0d6d2fbfea4d1e430b30c662e6b0b1f54fb0e9f377902fb09c0ba439c1"}
Nov 29 14:31:09 crc kubenswrapper[4907]: I1129 14:31:09.430066 4907 patch_prober.go:28] interesting pod/console-f9d7485db-5t44g container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Nov 29 14:31:09 crc kubenswrapper[4907]: I1129 14:31:09.430121 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5t44g" podUID="0c70da9a-ff96-432f-81ad-382c70754e70" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused"
Nov 29 14:31:10 crc kubenswrapper[4907]: I1129 14:31:10.074935 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-rthjj"
Nov 29 14:31:13 crc kubenswrapper[4907]: I1129 14:31:13.953269 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b28f2b91-7867-4490-9b8a-de6a5d8789e4","Type":"ContainerDied","Data":"678c039b21499cd8808034188c4999f7644413463f8bb1d71189c068baeb8255"}
Nov 29 14:31:13 crc kubenswrapper[4907]: I1129 14:31:13.954122 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="678c039b21499cd8808034188c4999f7644413463f8bb1d71189c068baeb8255"
Nov 29 14:31:13 crc kubenswrapper[4907]: I1129 14:31:13.984998 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 29 14:31:14 crc kubenswrapper[4907]: I1129 14:31:14.156638 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kubelet-dir\") pod \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\" (UID: \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\") "
Nov 29 14:31:14 crc kubenswrapper[4907]: I1129 14:31:14.156722 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kube-api-access\") pod \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\" (UID: \"b28f2b91-7867-4490-9b8a-de6a5d8789e4\") "
Nov 29 14:31:14 crc kubenswrapper[4907]: I1129 14:31:14.156809 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b28f2b91-7867-4490-9b8a-de6a5d8789e4" (UID: "b28f2b91-7867-4490-9b8a-de6a5d8789e4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 29 14:31:14 crc kubenswrapper[4907]: I1129 14:31:14.157338 4907 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 29 14:31:14 crc kubenswrapper[4907]: I1129 14:31:14.173544 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b28f2b91-7867-4490-9b8a-de6a5d8789e4" (UID: "b28f2b91-7867-4490-9b8a-de6a5d8789e4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:31:14 crc kubenswrapper[4907]: I1129 14:31:14.258319 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28f2b91-7867-4490-9b8a-de6a5d8789e4-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 29 14:31:14 crc kubenswrapper[4907]: I1129 14:31:14.960839 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 29 14:31:14 crc kubenswrapper[4907]: I1129 14:31:14.961840 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-25ct5" event={"ID":"9f50e55a-d427-4cde-a639-d6c7597e937a","Type":"ContainerStarted","Data":"88a202b817239e14087abc827541a230bf2865ff789ba735b2f070abd657ea58"}
Nov 29 14:31:14 crc kubenswrapper[4907]: I1129 14:31:14.978853 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-25ct5" podStartSLOduration=153.978826852 podStartE2EDuration="2m33.978826852s" podCreationTimestamp="2025-11-29 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:31:14.977248818 +0000 UTC m=+172.964086500" watchObservedRunningTime="2025-11-29 14:31:14.978826852 +0000 UTC m=+172.965664554"
Nov 29 14:31:17 crc kubenswrapper[4907]: I1129 14:31:17.093154 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g"
Nov 29 14:31:19 crc kubenswrapper[4907]: I1129 14:31:19.436485 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-5t44g"
Nov 29 14:31:19 crc kubenswrapper[4907]: I1129 14:31:19.445056 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-5t44g"
Nov 29 14:31:28 crc kubenswrapper[4907]: I1129 14:31:28.490258 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 14:31:28 crc kubenswrapper[4907]: I1129 14:31:28.490800 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 14:31:30 crc kubenswrapper[4907]: I1129 14:31:30.571844 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vjb5n"
Nov 29 14:31:31 crc kubenswrapper[4907]: E1129 14:31:31.111936 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Nov 29 14:31:31 crc kubenswrapper[4907]: E1129 14:31:31.112112 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7mh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nr4x6_openshift-marketplace(f1a15057-e639-4f44-970d-2b439ed484e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 29 14:31:31 crc kubenswrapper[4907]: E1129 14:31:31.113503 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nr4x6" podUID="f1a15057-e639-4f44-970d-2b439ed484e1"
Nov 29 14:31:31 crc kubenswrapper[4907]: I1129 14:31:31.196648 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 14:31:37 crc kubenswrapper[4907]: E1129 14:31:37.193607 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Nov 29 14:31:37 crc kubenswrapper[4907]: E1129 14:31:37.194315 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dd4kk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-g2mls_openshift-marketplace(10f3989d-c7bb-4c4a-91e1-3b0afaedac98): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 29 14:31:37 crc kubenswrapper[4907]: E1129 14:31:37.195808 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-g2mls" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98"
Nov 29 14:31:37 crc kubenswrapper[4907]: E1129 14:31:37.340907 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-nr4x6" podUID="f1a15057-e639-4f44-970d-2b439ed484e1"
Nov 29 14:31:38 crc kubenswrapper[4907]: E1129 14:31:38.708847 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Nov 29 14:31:38 crc kubenswrapper[4907]: E1129 14:31:38.708985 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ns79s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-487xn_openshift-marketplace(c6b02860-46c7-4498-a162-8e2833deb120): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 29 14:31:38 crc kubenswrapper[4907]: E1129 14:31:38.710146 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-487xn" podUID="c6b02860-46c7-4498-a162-8e2833deb120"
Nov 29 14:31:39 crc kubenswrapper[4907]: E1129 14:31:39.677677 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-g2mls" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98"
Nov 29 14:31:39 crc kubenswrapper[4907]: E1129 14:31:39.677718 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-487xn" podUID="c6b02860-46c7-4498-a162-8e2833deb120"
Nov 29 14:31:39 crc kubenswrapper[4907]: E1129 14:31:39.788805 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Nov 29 14:31:39 crc kubenswrapper[4907]: E1129 14:31:39.788961 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mqcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cflt7_openshift-marketplace(03ea4590-5449-4b54-97f5-e09dbc538dfe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 29 14:31:39 crc kubenswrapper[4907]: E1129 14:31:39.790300 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cflt7" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe"
Nov 29 14:31:39 crc kubenswrapper[4907]: I1129 14:31:39.819566 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 29 14:31:39 crc kubenswrapper[4907]: E1129 14:31:39.819777 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28f2b91-7867-4490-9b8a-de6a5d8789e4" containerName="pruner"
Nov 29 14:31:39 crc kubenswrapper[4907]: I1129 14:31:39.819791 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28f2b91-7867-4490-9b8a-de6a5d8789e4" containerName="pruner"
Nov 29 14:31:39 crc kubenswrapper[4907]: I1129 14:31:39.819896 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28f2b91-7867-4490-9b8a-de6a5d8789e4" containerName="pruner"
Nov 29 14:31:39 crc kubenswrapper[4907]: I1129 14:31:39.820232 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 29 14:31:39 crc kubenswrapper[4907]: I1129 14:31:39.828843 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Nov 29 14:31:39 crc kubenswrapper[4907]: I1129 14:31:39.830017 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Nov 29 14:31:39 crc kubenswrapper[4907]: I1129 14:31:39.833041 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 29 14:31:39 crc kubenswrapper[4907]: I1129 14:31:39.979426 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 29 14:31:39 crc kubenswrapper[4907]: I1129 14:31:39.979516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 29 14:31:40 crc kubenswrapper[4907]: I1129 14:31:40.081080 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 29 14:31:40 crc kubenswrapper[4907]: I1129 14:31:40.081152 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 29 14:31:40 crc kubenswrapper[4907]: I1129 14:31:40.081253 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 29 14:31:40 crc kubenswrapper[4907]: I1129 14:31:40.111817 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 29 14:31:40 crc kubenswrapper[4907]: I1129 14:31:40.165350 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 29 14:31:41 crc kubenswrapper[4907]: E1129 14:31:41.531746 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 29 14:31:41 crc kubenswrapper[4907]: E1129 14:31:41.532109 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pwbrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kkhhq_openshift-marketplace(e2b415cf-bdf4-4f9b-8ce8-a75b1a026337): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 29 14:31:41 crc kubenswrapper[4907]: E1129 14:31:41.534393 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kkhhq" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337"
Nov 29 14:31:41 crc kubenswrapper[4907]: E1129 14:31:41.948727 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cflt7" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe"
Nov 29 14:31:41 crc kubenswrapper[4907]: E1129 14:31:41.993327 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 29 14:31:41 crc kubenswrapper[4907]: E1129 14:31:41.993587 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wlpbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5jg22_openshift-marketplace(4e855aa2-07b3-4277-b106-6ba62b287992): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 29 14:31:41 crc kubenswrapper[4907]: E1129 14:31:41.994799 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5jg22" podUID="4e855aa2-07b3-4277-b106-6ba62b287992"
Nov 29 14:31:42 crc kubenswrapper[4907]: E1129 14:31:42.149071 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kkhhq" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337"
Nov 29 14:31:42 crc kubenswrapper[4907]: E1129 14:31:42.149085 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5jg22" podUID="4e855aa2-07b3-4277-b106-6ba62b287992"
Nov 29 14:31:42 crc kubenswrapper[4907]: I1129 14:31:42.208375 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 29 14:31:42 crc kubenswrapper[4907]: W1129 14:31:42.222533 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc3687dd4_0e9e_44fe_bb80_e458b97c047d.slice/crio-f5dc254811ad2324d6d673bf6c83766ca38e3daa4f78a5c563b64ef480e3190a WatchSource:0}: Error finding container f5dc254811ad2324d6d673bf6c83766ca38e3daa4f78a5c563b64ef480e3190a: Status 404 returned error can't find the container with id f5dc254811ad2324d6d673bf6c83766ca38e3daa4f78a5c563b64ef480e3190a
Nov 29 14:31:42 crc kubenswrapper[4907]: E1129 14:31:42.288859 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Nov 29 14:31:42 crc kubenswrapper[4907]: E1129 14:31:42.289036 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-99mgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-tjtqd_openshift-marketplace(f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 29 14:31:42 crc kubenswrapper[4907]: E1129 14:31:42.290429 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-tjtqd" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74"
Nov 29 14:31:43 crc kubenswrapper[4907]: I1129 14:31:43.152725 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c3687dd4-0e9e-44fe-bb80-e458b97c047d","Type":"ContainerStarted","Data":"f5dc254811ad2324d6d673bf6c83766ca38e3daa4f78a5c563b64ef480e3190a"}
Nov 29 14:31:43 crc kubenswrapper[4907]: E1129 14:31:43.155846 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-tjtqd" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.022210 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.024192 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.034062 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.073341 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.073413 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-var-lock\") pod \"installer-9-crc\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.073461 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kube-api-access\") pod \"installer-9-crc\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.174554 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-var-lock\") pod \"installer-9-crc\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.174600 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kube-api-access\") pod \"installer-9-crc\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.174694 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.174739 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-var-lock\") pod \"installer-9-crc\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.174762 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.201337 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kube-api-access\") pod \"installer-9-crc\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.342865 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 14:31:46 crc kubenswrapper[4907]: I1129 14:31:46.765295 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 29 14:31:47 crc kubenswrapper[4907]: W1129 14:31:47.070710 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9ee00348_f65d_4dd4_ba5d_420b395cb2a0.slice/crio-7d417c7bceb67952edf6043deab632e6699b3b7ffb0addc6fcf947229397a907 WatchSource:0}: Error finding container 7d417c7bceb67952edf6043deab632e6699b3b7ffb0addc6fcf947229397a907: Status 404 returned error can't find the container with id 7d417c7bceb67952edf6043deab632e6699b3b7ffb0addc6fcf947229397a907 Nov 29 14:31:47 crc kubenswrapper[4907]: I1129 14:31:47.180762 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9ee00348-f65d-4dd4-ba5d-420b395cb2a0","Type":"ContainerStarted","Data":"7d417c7bceb67952edf6043deab632e6699b3b7ffb0addc6fcf947229397a907"} Nov 29 14:31:51 crc kubenswrapper[4907]: I1129 14:31:51.225130 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c3687dd4-0e9e-44fe-bb80-e458b97c047d","Type":"ContainerStarted","Data":"eeff8f81bea5d93b0f6a41a37bcc68ef50748bee4b3272f51ec620739fe46023"} Nov 29 14:31:58 crc kubenswrapper[4907]: E1129 14:31:58.480627 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 29 14:31:58 crc kubenswrapper[4907]: E1129 14:31:58.481320 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5ncrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6z8qw_openshift-marketplace(4415eeb4-f833-4731-b210-7193f0b12556): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 29 14:31:58 crc kubenswrapper[4907]: E1129 14:31:58.482588 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6z8qw" 
podUID="4415eeb4-f833-4731-b210-7193f0b12556" Nov 29 14:31:58 crc kubenswrapper[4907]: I1129 14:31:58.490616 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:31:58 crc kubenswrapper[4907]: I1129 14:31:58.490726 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:31:58 crc kubenswrapper[4907]: I1129 14:31:58.492179 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:31:58 crc kubenswrapper[4907]: I1129 14:31:58.493408 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 14:31:58 crc kubenswrapper[4907]: I1129 14:31:58.493570 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f" gracePeriod=600 Nov 29 14:31:59 crc kubenswrapper[4907]: I1129 14:31:59.280397 4907 generic.go:334] "Generic (PLEG): container finished" podID="c3687dd4-0e9e-44fe-bb80-e458b97c047d" 
containerID="eeff8f81bea5d93b0f6a41a37bcc68ef50748bee4b3272f51ec620739fe46023" exitCode=0 Nov 29 14:31:59 crc kubenswrapper[4907]: I1129 14:31:59.281172 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c3687dd4-0e9e-44fe-bb80-e458b97c047d","Type":"ContainerDied","Data":"eeff8f81bea5d93b0f6a41a37bcc68ef50748bee4b3272f51ec620739fe46023"} Nov 29 14:31:59 crc kubenswrapper[4907]: I1129 14:31:59.285083 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f" exitCode=0 Nov 29 14:31:59 crc kubenswrapper[4907]: I1129 14:31:59.285181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f"} Nov 29 14:31:59 crc kubenswrapper[4907]: I1129 14:31:59.285291 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"a456b93cdbff1001e9ff31e71e560207b63f5cbe6f442049caf8634aa78242ee"} Nov 29 14:31:59 crc kubenswrapper[4907]: I1129 14:31:59.287420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9ee00348-f65d-4dd4-ba5d-420b395cb2a0","Type":"ContainerStarted","Data":"cbb4913b5d8b66bd6d98668f52efb2ad865e38ea1201c0046c67e804e866eac5"} Nov 29 14:31:59 crc kubenswrapper[4907]: E1129 14:31:59.289110 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/community-operators-6z8qw" podUID="4415eeb4-f833-4731-b210-7193f0b12556" Nov 29 14:31:59 crc kubenswrapper[4907]: I1129 14:31:59.357511 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=13.357495048 podStartE2EDuration="13.357495048s" podCreationTimestamp="2025-11-29 14:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:31:59.354564798 +0000 UTC m=+217.341402450" watchObservedRunningTime="2025-11-29 14:31:59.357495048 +0000 UTC m=+217.344332700" Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.295918 4907 generic.go:334] "Generic (PLEG): container finished" podID="4e855aa2-07b3-4277-b106-6ba62b287992" containerID="0d0061ecfe623d4094ef436921d73e91d18e65f2047a290305347c7dabc0862f" exitCode=0 Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.296200 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jg22" event={"ID":"4e855aa2-07b3-4277-b106-6ba62b287992","Type":"ContainerDied","Data":"0d0061ecfe623d4094ef436921d73e91d18e65f2047a290305347c7dabc0862f"} Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.300704 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487xn" event={"ID":"c6b02860-46c7-4498-a162-8e2833deb120","Type":"ContainerStarted","Data":"6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9"} Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.303425 4907 generic.go:334] "Generic (PLEG): container finished" podID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerID="1f8e1a520036ab4af7df334546855c04cdc64b253dbbc502ea0ddf054383f0d3" exitCode=0 Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.303504 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2mls" 
event={"ID":"10f3989d-c7bb-4c4a-91e1-3b0afaedac98","Type":"ContainerDied","Data":"1f8e1a520036ab4af7df334546855c04cdc64b253dbbc502ea0ddf054383f0d3"} Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.309594 4907 generic.go:334] "Generic (PLEG): container finished" podID="f1a15057-e639-4f44-970d-2b439ed484e1" containerID="4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d" exitCode=0 Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.309692 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr4x6" event={"ID":"f1a15057-e639-4f44-970d-2b439ed484e1","Type":"ContainerDied","Data":"4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d"} Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.320016 4907 generic.go:334] "Generic (PLEG): container finished" podID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerID="9471866d2d5c1f4231f96f822a0a096b5b92fd21a804ac8b60ee08a2a9740b47" exitCode=0 Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.320096 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtqd" event={"ID":"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74","Type":"ContainerDied","Data":"9471866d2d5c1f4231f96f822a0a096b5b92fd21a804ac8b60ee08a2a9740b47"} Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.327239 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflt7" event={"ID":"03ea4590-5449-4b54-97f5-e09dbc538dfe","Type":"ContainerStarted","Data":"b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4"} Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.329507 4907 generic.go:334] "Generic (PLEG): container finished" podID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" containerID="41ddbae1c1da75b1d93a17336cd6ef88c11ebf045f6fb2fa7d07d244fdf3746f" exitCode=0 Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.329663 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-kkhhq" event={"ID":"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337","Type":"ContainerDied","Data":"41ddbae1c1da75b1d93a17336cd6ef88c11ebf045f6fb2fa7d07d244fdf3746f"} Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.568850 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.616286 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kube-api-access\") pod \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\" (UID: \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\") " Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.616965 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kubelet-dir\") pod \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\" (UID: \"c3687dd4-0e9e-44fe-bb80-e458b97c047d\") " Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.617313 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c3687dd4-0e9e-44fe-bb80-e458b97c047d" (UID: "c3687dd4-0e9e-44fe-bb80-e458b97c047d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.623909 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c3687dd4-0e9e-44fe-bb80-e458b97c047d" (UID: "c3687dd4-0e9e-44fe-bb80-e458b97c047d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.718685 4907 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:00 crc kubenswrapper[4907]: I1129 14:32:00.718727 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3687dd4-0e9e-44fe-bb80-e458b97c047d-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.343071 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c3687dd4-0e9e-44fe-bb80-e458b97c047d","Type":"ContainerDied","Data":"f5dc254811ad2324d6d673bf6c83766ca38e3daa4f78a5c563b64ef480e3190a"} Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.343402 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5dc254811ad2324d6d673bf6c83766ca38e3daa4f78a5c563b64ef480e3190a" Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.343161 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.345568 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr4x6" event={"ID":"f1a15057-e639-4f44-970d-2b439ed484e1","Type":"ContainerStarted","Data":"c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad"} Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.348033 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtqd" event={"ID":"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74","Type":"ContainerStarted","Data":"618c711bbb2cb49384e0c0b40678b60b2e5d32e4ff47e43bff97dad1479cb56f"} Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.350743 4907 generic.go:334] "Generic (PLEG): container finished" podID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerID="b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4" exitCode=0 Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.350810 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflt7" event={"ID":"03ea4590-5449-4b54-97f5-e09dbc538dfe","Type":"ContainerDied","Data":"b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4"} Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.357459 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkhhq" event={"ID":"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337","Type":"ContainerStarted","Data":"e62a45ae586078083ec6fc30cdcd364d54c008d7312d7eb7da5bfbecf5d97b89"} Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.361904 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jg22" event={"ID":"4e855aa2-07b3-4277-b106-6ba62b287992","Type":"ContainerStarted","Data":"2d852bce9b42357dac90b956f7c3cc869e9da1226a0419b99c652373f60c8b9e"} Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 
14:32:01.365078 4907 generic.go:334] "Generic (PLEG): container finished" podID="c6b02860-46c7-4498-a162-8e2833deb120" containerID="6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9" exitCode=0 Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.365165 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487xn" event={"ID":"c6b02860-46c7-4498-a162-8e2833deb120","Type":"ContainerDied","Data":"6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9"} Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.369605 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nr4x6" podStartSLOduration=2.207857049 podStartE2EDuration="1m5.369591996s" podCreationTimestamp="2025-11-29 14:30:56 +0000 UTC" firstStartedPulling="2025-11-29 14:30:57.558077465 +0000 UTC m=+155.544915117" lastFinishedPulling="2025-11-29 14:32:00.719812412 +0000 UTC m=+218.706650064" observedRunningTime="2025-11-29 14:32:01.367986047 +0000 UTC m=+219.354823689" watchObservedRunningTime="2025-11-29 14:32:01.369591996 +0000 UTC m=+219.356429648" Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.371519 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2mls" event={"ID":"10f3989d-c7bb-4c4a-91e1-3b0afaedac98","Type":"ContainerStarted","Data":"0ad995ca53cf80aa49f06846b9b65f43df6b584652576b05a5ae26c2ac69ae8f"} Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.386853 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kkhhq" podStartSLOduration=2.278654977 podStartE2EDuration="1m3.386833304s" podCreationTimestamp="2025-11-29 14:30:58 +0000 UTC" firstStartedPulling="2025-11-29 14:30:59.704627903 +0000 UTC m=+157.691465555" lastFinishedPulling="2025-11-29 14:32:00.81280621 +0000 UTC m=+218.799643882" observedRunningTime="2025-11-29 
14:32:01.385325908 +0000 UTC m=+219.372163560" watchObservedRunningTime="2025-11-29 14:32:01.386833304 +0000 UTC m=+219.373670956" Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.418937 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jg22" podStartSLOduration=2.15482759 podStartE2EDuration="1m3.418914597s" podCreationTimestamp="2025-11-29 14:30:58 +0000 UTC" firstStartedPulling="2025-11-29 14:30:59.717491523 +0000 UTC m=+157.704329175" lastFinishedPulling="2025-11-29 14:32:00.98157853 +0000 UTC m=+218.968416182" observedRunningTime="2025-11-29 14:32:01.417231866 +0000 UTC m=+219.404069518" watchObservedRunningTime="2025-11-29 14:32:01.418914597 +0000 UTC m=+219.405752249" Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.488889 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g2mls" podStartSLOduration=5.441757656 podStartE2EDuration="1m6.48886865s" podCreationTimestamp="2025-11-29 14:30:55 +0000 UTC" firstStartedPulling="2025-11-29 14:30:59.728375833 +0000 UTC m=+157.715213485" lastFinishedPulling="2025-11-29 14:32:00.775486827 +0000 UTC m=+218.762324479" observedRunningTime="2025-11-29 14:32:01.48493667 +0000 UTC m=+219.471774322" watchObservedRunningTime="2025-11-29 14:32:01.48886865 +0000 UTC m=+219.475706302" Nov 29 14:32:01 crc kubenswrapper[4907]: I1129 14:32:01.489562 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tjtqd" podStartSLOduration=3.2109346 podStartE2EDuration="1m6.489558511s" podCreationTimestamp="2025-11-29 14:30:55 +0000 UTC" firstStartedPulling="2025-11-29 14:30:57.54854742 +0000 UTC m=+155.535385072" lastFinishedPulling="2025-11-29 14:32:00.827171311 +0000 UTC m=+218.814008983" observedRunningTime="2025-11-29 14:32:01.462801582 +0000 UTC m=+219.449639244" watchObservedRunningTime="2025-11-29 14:32:01.489558511 +0000 
UTC m=+219.476396163" Nov 29 14:32:02 crc kubenswrapper[4907]: I1129 14:32:02.379854 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflt7" event={"ID":"03ea4590-5449-4b54-97f5-e09dbc538dfe","Type":"ContainerStarted","Data":"b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f"} Nov 29 14:32:02 crc kubenswrapper[4907]: I1129 14:32:02.383295 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487xn" event={"ID":"c6b02860-46c7-4498-a162-8e2833deb120","Type":"ContainerStarted","Data":"6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc"} Nov 29 14:32:02 crc kubenswrapper[4907]: I1129 14:32:02.423617 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-487xn" podStartSLOduration=3.381790993 podStartE2EDuration="1m4.423594465s" podCreationTimestamp="2025-11-29 14:30:58 +0000 UTC" firstStartedPulling="2025-11-29 14:31:00.77812299 +0000 UTC m=+158.764960652" lastFinishedPulling="2025-11-29 14:32:01.819926472 +0000 UTC m=+219.806764124" observedRunningTime="2025-11-29 14:32:02.419569872 +0000 UTC m=+220.406407524" watchObservedRunningTime="2025-11-29 14:32:02.423594465 +0000 UTC m=+220.410432117" Nov 29 14:32:02 crc kubenswrapper[4907]: I1129 14:32:02.427680 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cflt7" podStartSLOduration=2.420904432 podStartE2EDuration="1m3.42766816s" podCreationTimestamp="2025-11-29 14:30:59 +0000 UTC" firstStartedPulling="2025-11-29 14:31:00.778147561 +0000 UTC m=+158.764985213" lastFinishedPulling="2025-11-29 14:32:01.784911289 +0000 UTC m=+219.771748941" observedRunningTime="2025-11-29 14:32:02.403230601 +0000 UTC m=+220.390068253" watchObservedRunningTime="2025-11-29 14:32:02.42766816 +0000 UTC m=+220.414505822" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.144336 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tjtqd" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.145214 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tjtqd" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.279301 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g2mls" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.279586 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g2mls" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.655865 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nr4x6" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.655912 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nr4x6" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.687869 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g2mls" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.690959 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tjtqd" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.731189 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nr4x6" Nov 29 14:32:06 crc kubenswrapper[4907]: I1129 14:32:06.753394 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g2mls" Nov 29 14:32:07 crc kubenswrapper[4907]: I1129 14:32:07.033973 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-tjtqd" Nov 29 14:32:07 crc kubenswrapper[4907]: I1129 14:32:07.467994 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nr4x6" Nov 29 14:32:08 crc kubenswrapper[4907]: I1129 14:32:08.436058 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:32:08 crc kubenswrapper[4907]: I1129 14:32:08.437016 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:32:08 crc kubenswrapper[4907]: I1129 14:32:08.504820 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:32:08 crc kubenswrapper[4907]: I1129 14:32:08.529621 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nr4x6"] Nov 29 14:32:08 crc kubenswrapper[4907]: I1129 14:32:08.830580 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:32:08 crc kubenswrapper[4907]: I1129 14:32:08.831039 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:32:08 crc kubenswrapper[4907]: I1129 14:32:08.888745 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.434506 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nr4x6" podUID="f1a15057-e639-4f44-970d-2b439ed484e1" containerName="registry-server" containerID="cri-o://c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad" gracePeriod=2 Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.490558 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.490866 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.505530 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.512022 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.527680 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.527774 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.566533 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.627301 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.824586 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nr4x6" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.948817 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7mh8\" (UniqueName: \"kubernetes.io/projected/f1a15057-e639-4f44-970d-2b439ed484e1-kube-api-access-v7mh8\") pod \"f1a15057-e639-4f44-970d-2b439ed484e1\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.948912 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-catalog-content\") pod \"f1a15057-e639-4f44-970d-2b439ed484e1\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.949058 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-utilities\") pod \"f1a15057-e639-4f44-970d-2b439ed484e1\" (UID: \"f1a15057-e639-4f44-970d-2b439ed484e1\") " Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.950563 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-utilities" (OuterVolumeSpecName: "utilities") pod "f1a15057-e639-4f44-970d-2b439ed484e1" (UID: "f1a15057-e639-4f44-970d-2b439ed484e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.957607 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a15057-e639-4f44-970d-2b439ed484e1-kube-api-access-v7mh8" (OuterVolumeSpecName: "kube-api-access-v7mh8") pod "f1a15057-e639-4f44-970d-2b439ed484e1" (UID: "f1a15057-e639-4f44-970d-2b439ed484e1"). InnerVolumeSpecName "kube-api-access-v7mh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:32:09 crc kubenswrapper[4907]: I1129 14:32:09.997482 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1a15057-e639-4f44-970d-2b439ed484e1" (UID: "f1a15057-e639-4f44-970d-2b439ed484e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.050727 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7mh8\" (UniqueName: \"kubernetes.io/projected/f1a15057-e639-4f44-970d-2b439ed484e1-kube-api-access-v7mh8\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.050763 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.050773 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a15057-e639-4f44-970d-2b439ed484e1-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.444950 4907 generic.go:334] "Generic (PLEG): container finished" podID="f1a15057-e639-4f44-970d-2b439ed484e1" containerID="c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad" exitCode=0 Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.445033 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nr4x6" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.445098 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr4x6" event={"ID":"f1a15057-e639-4f44-970d-2b439ed484e1","Type":"ContainerDied","Data":"c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad"} Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.445150 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr4x6" event={"ID":"f1a15057-e639-4f44-970d-2b439ed484e1","Type":"ContainerDied","Data":"9df01560ebb49a3824a4227871d3427e4913dfe5fdb3ccafd045357105014564"} Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.445178 4907 scope.go:117] "RemoveContainer" containerID="c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.472424 4907 scope.go:117] "RemoveContainer" containerID="4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.502521 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nr4x6"] Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.502579 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nr4x6"] Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.516547 4907 scope.go:117] "RemoveContainer" containerID="f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.518830 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.520609 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:32:10 crc 
kubenswrapper[4907]: I1129 14:32:10.551092 4907 scope.go:117] "RemoveContainer" containerID="c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad" Nov 29 14:32:10 crc kubenswrapper[4907]: E1129 14:32:10.551845 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad\": container with ID starting with c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad not found: ID does not exist" containerID="c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.551901 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad"} err="failed to get container status \"c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad\": rpc error: code = NotFound desc = could not find container \"c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad\": container with ID starting with c80d2226dccda9fbdffebaa81182fdfe36bc0c4e4b0b6297cc2ac9982e5258ad not found: ID does not exist" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.551939 4907 scope.go:117] "RemoveContainer" containerID="4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d" Nov 29 14:32:10 crc kubenswrapper[4907]: E1129 14:32:10.552710 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d\": container with ID starting with 4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d not found: ID does not exist" containerID="4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.552742 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d"} err="failed to get container status \"4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d\": rpc error: code = NotFound desc = could not find container \"4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d\": container with ID starting with 4c9dc0bb728de2bc0023cc308a20378a8d0446633c09a915b064f9504877b24d not found: ID does not exist" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.552766 4907 scope.go:117] "RemoveContainer" containerID="f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86" Nov 29 14:32:10 crc kubenswrapper[4907]: E1129 14:32:10.553542 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86\": container with ID starting with f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86 not found: ID does not exist" containerID="f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86" Nov 29 14:32:10 crc kubenswrapper[4907]: I1129 14:32:10.553684 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86"} err="failed to get container status \"f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86\": rpc error: code = NotFound desc = could not find container \"f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86\": container with ID starting with f0454d8cc483a141c3c95fea4ed35d290899c171712987e8a8b55b0de5097b86 not found: ID does not exist" Nov 29 14:32:11 crc kubenswrapper[4907]: I1129 14:32:11.136566 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jg22"] Nov 29 14:32:11 crc kubenswrapper[4907]: I1129 14:32:11.453683 4907 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-5jg22" podUID="4e855aa2-07b3-4277-b106-6ba62b287992" containerName="registry-server" containerID="cri-o://2d852bce9b42357dac90b956f7c3cc869e9da1226a0419b99c652373f60c8b9e" gracePeriod=2 Nov 29 14:32:12 crc kubenswrapper[4907]: I1129 14:32:12.488321 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a15057-e639-4f44-970d-2b439ed484e1" path="/var/lib/kubelet/pods/f1a15057-e639-4f44-970d-2b439ed484e1/volumes" Nov 29 14:32:13 crc kubenswrapper[4907]: I1129 14:32:13.471105 4907 generic.go:334] "Generic (PLEG): container finished" podID="4e855aa2-07b3-4277-b106-6ba62b287992" containerID="2d852bce9b42357dac90b956f7c3cc869e9da1226a0419b99c652373f60c8b9e" exitCode=0 Nov 29 14:32:13 crc kubenswrapper[4907]: I1129 14:32:13.471579 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jg22" event={"ID":"4e855aa2-07b3-4277-b106-6ba62b287992","Type":"ContainerDied","Data":"2d852bce9b42357dac90b956f7c3cc869e9da1226a0419b99c652373f60c8b9e"} Nov 29 14:32:13 crc kubenswrapper[4907]: I1129 14:32:13.522518 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cflt7"] Nov 29 14:32:13 crc kubenswrapper[4907]: I1129 14:32:13.522738 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cflt7" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerName="registry-server" containerID="cri-o://b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f" gracePeriod=2 Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.222643 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.321599 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-catalog-content\") pod \"4e855aa2-07b3-4277-b106-6ba62b287992\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.321772 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-utilities\") pod \"4e855aa2-07b3-4277-b106-6ba62b287992\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.321832 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlpbt\" (UniqueName: \"kubernetes.io/projected/4e855aa2-07b3-4277-b106-6ba62b287992-kube-api-access-wlpbt\") pod \"4e855aa2-07b3-4277-b106-6ba62b287992\" (UID: \"4e855aa2-07b3-4277-b106-6ba62b287992\") " Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.324636 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-utilities" (OuterVolumeSpecName: "utilities") pod "4e855aa2-07b3-4277-b106-6ba62b287992" (UID: "4e855aa2-07b3-4277-b106-6ba62b287992"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.332108 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e855aa2-07b3-4277-b106-6ba62b287992-kube-api-access-wlpbt" (OuterVolumeSpecName: "kube-api-access-wlpbt") pod "4e855aa2-07b3-4277-b106-6ba62b287992" (UID: "4e855aa2-07b3-4277-b106-6ba62b287992"). InnerVolumeSpecName "kube-api-access-wlpbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.342452 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e855aa2-07b3-4277-b106-6ba62b287992" (UID: "4e855aa2-07b3-4277-b106-6ba62b287992"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.374982 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.423949 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-catalog-content\") pod \"03ea4590-5449-4b54-97f5-e09dbc538dfe\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.424016 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mqcr\" (UniqueName: \"kubernetes.io/projected/03ea4590-5449-4b54-97f5-e09dbc538dfe-kube-api-access-5mqcr\") pod \"03ea4590-5449-4b54-97f5-e09dbc538dfe\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.424097 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-utilities\") pod \"03ea4590-5449-4b54-97f5-e09dbc538dfe\" (UID: \"03ea4590-5449-4b54-97f5-e09dbc538dfe\") " Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.424511 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.424536 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e855aa2-07b3-4277-b106-6ba62b287992-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.424551 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlpbt\" (UniqueName: \"kubernetes.io/projected/4e855aa2-07b3-4277-b106-6ba62b287992-kube-api-access-wlpbt\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.425531 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-utilities" (OuterVolumeSpecName: "utilities") pod "03ea4590-5449-4b54-97f5-e09dbc538dfe" (UID: "03ea4590-5449-4b54-97f5-e09dbc538dfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.429115 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ea4590-5449-4b54-97f5-e09dbc538dfe-kube-api-access-5mqcr" (OuterVolumeSpecName: "kube-api-access-5mqcr") pod "03ea4590-5449-4b54-97f5-e09dbc538dfe" (UID: "03ea4590-5449-4b54-97f5-e09dbc538dfe"). InnerVolumeSpecName "kube-api-access-5mqcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.479692 4907 generic.go:334] "Generic (PLEG): container finished" podID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerID="b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f" exitCode=0 Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.479820 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cflt7" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.482400 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jg22" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.493247 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflt7" event={"ID":"03ea4590-5449-4b54-97f5-e09dbc538dfe","Type":"ContainerDied","Data":"b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f"} Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.493288 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflt7" event={"ID":"03ea4590-5449-4b54-97f5-e09dbc538dfe","Type":"ContainerDied","Data":"58ae8e42b2b417d2be5aedd977b9a86439654d7795005d54c305dd45eb54f8ea"} Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.493301 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jg22" event={"ID":"4e855aa2-07b3-4277-b106-6ba62b287992","Type":"ContainerDied","Data":"ac090139c909f70422d905693cc7f029f7b6e00d1226ce80fc68d5c3230f7e1c"} Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.493320 4907 scope.go:117] "RemoveContainer" containerID="b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.515251 4907 scope.go:117] "RemoveContainer" containerID="b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.529921 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mqcr\" (UniqueName: \"kubernetes.io/projected/03ea4590-5449-4b54-97f5-e09dbc538dfe-kube-api-access-5mqcr\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.529962 4907 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.541387 4907 scope.go:117] "RemoveContainer" containerID="9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.555014 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jg22"] Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.564204 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jg22"] Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.566815 4907 scope.go:117] "RemoveContainer" containerID="b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f" Nov 29 14:32:14 crc kubenswrapper[4907]: E1129 14:32:14.568430 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f\": container with ID starting with b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f not found: ID does not exist" containerID="b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.568482 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f"} err="failed to get container status \"b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f\": rpc error: code = NotFound desc = could not find container \"b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f\": container with ID starting with b846ee36c9aa4a39aef68dd882d90a8a9687df90aa4c908220a12b8da2abc08f not found: ID does not exist" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.568504 4907 scope.go:117] "RemoveContainer" 
containerID="b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4" Nov 29 14:32:14 crc kubenswrapper[4907]: E1129 14:32:14.569703 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4\": container with ID starting with b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4 not found: ID does not exist" containerID="b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.569726 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4"} err="failed to get container status \"b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4\": rpc error: code = NotFound desc = could not find container \"b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4\": container with ID starting with b370737ede49ad435e5c3e11968e7ab65d1fba6b794aa189d51b374cadf1d3f4 not found: ID does not exist" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.569765 4907 scope.go:117] "RemoveContainer" containerID="9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33" Nov 29 14:32:14 crc kubenswrapper[4907]: E1129 14:32:14.571516 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33\": container with ID starting with 9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33 not found: ID does not exist" containerID="9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.571545 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33"} err="failed to get container status \"9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33\": rpc error: code = NotFound desc = could not find container \"9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33\": container with ID starting with 9cbe195dcc731903eccdd8b18a4dd0803f41e6d21680885b72a09ac0efddeb33 not found: ID does not exist" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.571567 4907 scope.go:117] "RemoveContainer" containerID="2d852bce9b42357dac90b956f7c3cc869e9da1226a0419b99c652373f60c8b9e" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.585532 4907 scope.go:117] "RemoveContainer" containerID="0d0061ecfe623d4094ef436921d73e91d18e65f2047a290305347c7dabc0862f" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.603028 4907 scope.go:117] "RemoveContainer" containerID="99eb21ebcdbcf4ff05720457a360eb596e4f8845510302040d85ed447d735a7c" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.674262 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03ea4590-5449-4b54-97f5-e09dbc538dfe" (UID: "03ea4590-5449-4b54-97f5-e09dbc538dfe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.731240 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ea4590-5449-4b54-97f5-e09dbc538dfe-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.814028 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cflt7"] Nov 29 14:32:14 crc kubenswrapper[4907]: I1129 14:32:14.824756 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cflt7"] Nov 29 14:32:16 crc kubenswrapper[4907]: I1129 14:32:16.489610 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe" path="/var/lib/kubelet/pods/03ea4590-5449-4b54-97f5-e09dbc538dfe/volumes" Nov 29 14:32:16 crc kubenswrapper[4907]: I1129 14:32:16.490204 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e855aa2-07b3-4277-b106-6ba62b287992" path="/var/lib/kubelet/pods/4e855aa2-07b3-4277-b106-6ba62b287992/volumes" Nov 29 14:32:16 crc kubenswrapper[4907]: I1129 14:32:16.498015 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z8qw" event={"ID":"4415eeb4-f833-4731-b210-7193f0b12556","Type":"ContainerStarted","Data":"6cfc065559f1c8a05ec4ddf11c1a6b0d592fa594a42c8d4eef6443d4dddec720"} Nov 29 14:32:17 crc kubenswrapper[4907]: I1129 14:32:17.505232 4907 generic.go:334] "Generic (PLEG): container finished" podID="4415eeb4-f833-4731-b210-7193f0b12556" containerID="6cfc065559f1c8a05ec4ddf11c1a6b0d592fa594a42c8d4eef6443d4dddec720" exitCode=0 Nov 29 14:32:17 crc kubenswrapper[4907]: I1129 14:32:17.505288 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z8qw" 
event={"ID":"4415eeb4-f833-4731-b210-7193f0b12556","Type":"ContainerDied","Data":"6cfc065559f1c8a05ec4ddf11c1a6b0d592fa594a42c8d4eef6443d4dddec720"} Nov 29 14:32:18 crc kubenswrapper[4907]: I1129 14:32:18.526719 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z8qw" event={"ID":"4415eeb4-f833-4731-b210-7193f0b12556","Type":"ContainerStarted","Data":"6b24f629a074ec8043b0bc7f396e79114ce0f26319317b21c55715d6610704f3"} Nov 29 14:32:18 crc kubenswrapper[4907]: I1129 14:32:18.555684 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6z8qw" podStartSLOduration=3.229941041 podStartE2EDuration="1m22.555662995s" podCreationTimestamp="2025-11-29 14:30:56 +0000 UTC" firstStartedPulling="2025-11-29 14:30:58.641597978 +0000 UTC m=+156.628435630" lastFinishedPulling="2025-11-29 14:32:17.967319932 +0000 UTC m=+235.954157584" observedRunningTime="2025-11-29 14:32:18.5519081 +0000 UTC m=+236.538745762" watchObservedRunningTime="2025-11-29 14:32:18.555662995 +0000 UTC m=+236.542500657" Nov 29 14:32:18 crc kubenswrapper[4907]: I1129 14:32:18.618320 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qdmt6"] Nov 29 14:32:26 crc kubenswrapper[4907]: I1129 14:32:26.553872 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6z8qw" Nov 29 14:32:26 crc kubenswrapper[4907]: I1129 14:32:26.554524 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6z8qw" Nov 29 14:32:26 crc kubenswrapper[4907]: I1129 14:32:26.606256 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6z8qw" Nov 29 14:32:27 crc kubenswrapper[4907]: I1129 14:32:27.632276 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-6z8qw" Nov 29 14:32:27 crc kubenswrapper[4907]: I1129 14:32:27.766298 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6z8qw"] Nov 29 14:32:29 crc kubenswrapper[4907]: I1129 14:32:29.585821 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6z8qw" podUID="4415eeb4-f833-4731-b210-7193f0b12556" containerName="registry-server" containerID="cri-o://6b24f629a074ec8043b0bc7f396e79114ce0f26319317b21c55715d6610704f3" gracePeriod=2 Nov 29 14:32:30 crc kubenswrapper[4907]: I1129 14:32:30.595478 4907 generic.go:334] "Generic (PLEG): container finished" podID="4415eeb4-f833-4731-b210-7193f0b12556" containerID="6b24f629a074ec8043b0bc7f396e79114ce0f26319317b21c55715d6610704f3" exitCode=0 Nov 29 14:32:30 crc kubenswrapper[4907]: I1129 14:32:30.595551 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z8qw" event={"ID":"4415eeb4-f833-4731-b210-7193f0b12556","Type":"ContainerDied","Data":"6b24f629a074ec8043b0bc7f396e79114ce0f26319317b21c55715d6610704f3"} Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.120298 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6z8qw" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.231151 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ncrj\" (UniqueName: \"kubernetes.io/projected/4415eeb4-f833-4731-b210-7193f0b12556-kube-api-access-5ncrj\") pod \"4415eeb4-f833-4731-b210-7193f0b12556\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.231250 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-catalog-content\") pod \"4415eeb4-f833-4731-b210-7193f0b12556\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.231299 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-utilities\") pod \"4415eeb4-f833-4731-b210-7193f0b12556\" (UID: \"4415eeb4-f833-4731-b210-7193f0b12556\") " Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.232432 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-utilities" (OuterVolumeSpecName: "utilities") pod "4415eeb4-f833-4731-b210-7193f0b12556" (UID: "4415eeb4-f833-4731-b210-7193f0b12556"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.239583 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4415eeb4-f833-4731-b210-7193f0b12556-kube-api-access-5ncrj" (OuterVolumeSpecName: "kube-api-access-5ncrj") pod "4415eeb4-f833-4731-b210-7193f0b12556" (UID: "4415eeb4-f833-4731-b210-7193f0b12556"). InnerVolumeSpecName "kube-api-access-5ncrj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.301697 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4415eeb4-f833-4731-b210-7193f0b12556" (UID: "4415eeb4-f833-4731-b210-7193f0b12556"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.333145 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ncrj\" (UniqueName: \"kubernetes.io/projected/4415eeb4-f833-4731-b210-7193f0b12556-kube-api-access-5ncrj\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.333222 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.333234 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4415eeb4-f833-4731-b210-7193f0b12556-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.606178 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z8qw" event={"ID":"4415eeb4-f833-4731-b210-7193f0b12556","Type":"ContainerDied","Data":"404e15e1bd8fa0fc022b6e79becf3b3c745d24c4ca8ae788ca953144e8eb01d1"} Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.606246 4907 scope.go:117] "RemoveContainer" containerID="6b24f629a074ec8043b0bc7f396e79114ce0f26319317b21c55715d6610704f3" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.606309 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6z8qw" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.632623 4907 scope.go:117] "RemoveContainer" containerID="6cfc065559f1c8a05ec4ddf11c1a6b0d592fa594a42c8d4eef6443d4dddec720" Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.660109 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6z8qw"] Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.666134 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6z8qw"] Nov 29 14:32:31 crc kubenswrapper[4907]: I1129 14:32:31.667857 4907 scope.go:117] "RemoveContainer" containerID="ca98c11a5d5284d37efda32dd383dbfdc0970c51531f335e93e8661612dfb555" Nov 29 14:32:32 crc kubenswrapper[4907]: I1129 14:32:32.492810 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4415eeb4-f833-4731-b210-7193f0b12556" path="/var/lib/kubelet/pods/4415eeb4-f833-4731-b210-7193f0b12556/volumes" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.442801 4907 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443523 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerName="extract-content" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443545 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerName="extract-content" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443571 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a15057-e639-4f44-970d-2b439ed484e1" containerName="extract-utilities" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443585 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a15057-e639-4f44-970d-2b439ed484e1" 
containerName="extract-utilities" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443607 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443620 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443639 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e855aa2-07b3-4277-b106-6ba62b287992" containerName="extract-content" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443652 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e855aa2-07b3-4277-b106-6ba62b287992" containerName="extract-content" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443669 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e855aa2-07b3-4277-b106-6ba62b287992" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443683 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e855aa2-07b3-4277-b106-6ba62b287992" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443719 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4415eeb4-f833-4731-b210-7193f0b12556" containerName="extract-content" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443736 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4415eeb4-f833-4731-b210-7193f0b12556" containerName="extract-content" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443762 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4415eeb4-f833-4731-b210-7193f0b12556" containerName="extract-utilities" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443779 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4415eeb4-f833-4731-b210-7193f0b12556" 
containerName="extract-utilities" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443801 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a15057-e639-4f44-970d-2b439ed484e1" containerName="extract-content" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443814 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a15057-e639-4f44-970d-2b439ed484e1" containerName="extract-content" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443832 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a15057-e639-4f44-970d-2b439ed484e1" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443847 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a15057-e639-4f44-970d-2b439ed484e1" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443867 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4415eeb4-f833-4731-b210-7193f0b12556" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443880 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4415eeb4-f833-4731-b210-7193f0b12556" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443897 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e855aa2-07b3-4277-b106-6ba62b287992" containerName="extract-utilities" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443910 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e855aa2-07b3-4277-b106-6ba62b287992" containerName="extract-utilities" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443929 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3687dd4-0e9e-44fe-bb80-e458b97c047d" containerName="pruner" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443942 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3687dd4-0e9e-44fe-bb80-e458b97c047d" 
containerName="pruner" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.443961 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerName="extract-utilities" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.443976 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerName="extract-utilities" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.444175 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4415eeb4-f833-4731-b210-7193f0b12556" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.444207 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3687dd4-0e9e-44fe-bb80-e458b97c047d" containerName="pruner" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.444241 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e855aa2-07b3-4277-b106-6ba62b287992" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.444261 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a15057-e639-4f44-970d-2b439ed484e1" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.444284 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ea4590-5449-4b54-97f5-e09dbc538dfe" containerName="registry-server" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.444922 4907 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.445363 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36" gracePeriod=15 Nov 29 14:32:36 crc 
kubenswrapper[4907]: I1129 14:32:36.445469 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.445473 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f" gracePeriod=15 Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.445534 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297" gracePeriod=15 Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.445598 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f" gracePeriod=15 Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.445621 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a" gracePeriod=15 Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.445975 4907 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.446311 4907 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446368 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.446390 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446417 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.446465 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446479 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.446499 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446513 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.446542 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446554 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.446573 4907 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446586 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446757 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446783 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446801 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446815 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.446837 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.446995 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.447009 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.447193 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.510334 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.510389 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.510423 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.510467 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.510501 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.510530 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.510571 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.510600 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.531961 4907 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.47:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611715 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611783 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611826 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611860 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611884 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611897 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611952 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611918 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611992 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.612015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.612020 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.612047 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.612051 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.611967 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.612074 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.612115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.642922 4907 generic.go:334] "Generic (PLEG): container finished" podID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" 
containerID="cbb4913b5d8b66bd6d98668f52efb2ad865e38ea1201c0046c67e804e866eac5" exitCode=0 Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.643052 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9ee00348-f65d-4dd4-ba5d-420b395cb2a0","Type":"ContainerDied","Data":"cbb4913b5d8b66bd6d98668f52efb2ad865e38ea1201c0046c67e804e866eac5"} Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.644063 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.645992 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.647774 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.649085 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297" exitCode=0 Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.649125 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f" exitCode=0 Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.649141 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f" exitCode=0 Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.649157 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a" exitCode=2 Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.649217 4907 scope.go:117] "RemoveContainer" containerID="4d2c225ef3b4151847a934dd3eb26a1cccad9a7bb360b65a14dd0b892783e0fc" Nov 29 14:32:36 crc kubenswrapper[4907]: I1129 14:32:36.832885 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:36 crc kubenswrapper[4907]: W1129 14:32:36.867517 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-29b6e65156f6e87ca6a5678a347b69e3b7612531ec6a52d5d790205c55f773a2 WatchSource:0}: Error finding container 29b6e65156f6e87ca6a5678a347b69e3b7612531ec6a52d5d790205c55f773a2: Status 404 returned error can't find the container with id 29b6e65156f6e87ca6a5678a347b69e3b7612531ec6a52d5d790205c55f773a2 Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.872798 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.47:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187c80c97b8bff98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 14:32:36.871946136 +0000 UTC m=+254.858783828,LastTimestamp:2025-11-29 14:32:36.871946136 +0000 UTC m=+254.858783828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.967542 4907 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.968035 4907 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.968622 4907 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.968963 4907 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.969280 4907 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:36 crc 
kubenswrapper[4907]: I1129 14:32:36.969327 4907 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 29 14:32:36 crc kubenswrapper[4907]: E1129 14:32:36.969647 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="200ms" Nov 29 14:32:37 crc kubenswrapper[4907]: E1129 14:32:37.171049 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="400ms" Nov 29 14:32:37 crc kubenswrapper[4907]: E1129 14:32:37.572644 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="800ms" Nov 29 14:32:37 crc kubenswrapper[4907]: I1129 14:32:37.669505 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 14:32:37 crc kubenswrapper[4907]: I1129 14:32:37.674202 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f"} Nov 29 14:32:37 crc kubenswrapper[4907]: I1129 14:32:37.674304 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"29b6e65156f6e87ca6a5678a347b69e3b7612531ec6a52d5d790205c55f773a2"} Nov 29 14:32:37 crc kubenswrapper[4907]: I1129 14:32:37.675543 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:37 crc kubenswrapper[4907]: E1129 14:32:37.675609 4907 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.47:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.011827 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.012495 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.130329 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kubelet-dir\") pod \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.130509 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-var-lock\") pod \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.130525 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9ee00348-f65d-4dd4-ba5d-420b395cb2a0" (UID: "9ee00348-f65d-4dd4-ba5d-420b395cb2a0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.130605 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kube-api-access\") pod \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\" (UID: \"9ee00348-f65d-4dd4-ba5d-420b395cb2a0\") " Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.130624 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-var-lock" (OuterVolumeSpecName: "var-lock") pod "9ee00348-f65d-4dd4-ba5d-420b395cb2a0" (UID: "9ee00348-f65d-4dd4-ba5d-420b395cb2a0"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.130896 4907 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-var-lock\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.130924 4907 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.137594 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9ee00348-f65d-4dd4-ba5d-420b395cb2a0" (UID: "9ee00348-f65d-4dd4-ba5d-420b395cb2a0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.232423 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ee00348-f65d-4dd4-ba5d-420b395cb2a0-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:38 crc kubenswrapper[4907]: E1129 14:32:38.373847 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="1.6s" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.680650 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9ee00348-f65d-4dd4-ba5d-420b395cb2a0","Type":"ContainerDied","Data":"7d417c7bceb67952edf6043deab632e6699b3b7ffb0addc6fcf947229397a907"} Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.680699 4907 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d417c7bceb67952edf6043deab632e6699b3b7ffb0addc6fcf947229397a907" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.680733 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.683973 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.844107 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.845343 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.845807 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.845974 4907 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.950853 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.950958 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.951019 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.951033 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.951128 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.951211 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.951486 4907 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.951509 4907 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:38 crc kubenswrapper[4907]: I1129 14:32:38.951526 4907 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.694125 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.695411 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36" exitCode=0 Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.695514 4907 scope.go:117] "RemoveContainer" containerID="97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.695571 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.720602 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.721176 4907 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.724549 4907 scope.go:117] "RemoveContainer" containerID="222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.754273 4907 scope.go:117] "RemoveContainer" containerID="21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.778847 4907 scope.go:117] "RemoveContainer" containerID="3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.803137 4907 scope.go:117] "RemoveContainer" containerID="2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.830606 4907 scope.go:117] "RemoveContainer" containerID="3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.864864 4907 scope.go:117] "RemoveContainer" containerID="97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297" Nov 29 14:32:39 crc kubenswrapper[4907]: E1129 14:32:39.867621 4907 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\": container with ID starting with 97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297 not found: ID does not exist" containerID="97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.867808 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297"} err="failed to get container status \"97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\": rpc error: code = NotFound desc = could not find container \"97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297\": container with ID starting with 97b894053379fd27725499f85fa628b999547e28c06e21be959738015be3f297 not found: ID does not exist" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.867856 4907 scope.go:117] "RemoveContainer" containerID="222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f" Nov 29 14:32:39 crc kubenswrapper[4907]: E1129 14:32:39.868528 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\": container with ID starting with 222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f not found: ID does not exist" containerID="222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.868601 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f"} err="failed to get container status \"222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\": rpc error: code = NotFound desc = could 
not find container \"222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f\": container with ID starting with 222676e258d660a2d2794dfd8e1be0b76cd5fb69f7a9468357d5cb808163868f not found: ID does not exist" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.868644 4907 scope.go:117] "RemoveContainer" containerID="21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f" Nov 29 14:32:39 crc kubenswrapper[4907]: E1129 14:32:39.869314 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\": container with ID starting with 21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f not found: ID does not exist" containerID="21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.869366 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f"} err="failed to get container status \"21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\": rpc error: code = NotFound desc = could not find container \"21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f\": container with ID starting with 21492162006b5b3858b88a09bb67d8eec0bc6fda1d45322e4e317efc471ff20f not found: ID does not exist" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.869401 4907 scope.go:117] "RemoveContainer" containerID="3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a" Nov 29 14:32:39 crc kubenswrapper[4907]: E1129 14:32:39.869874 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\": container with ID starting with 3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a not found: 
ID does not exist" containerID="3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.870026 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a"} err="failed to get container status \"3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\": rpc error: code = NotFound desc = could not find container \"3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a\": container with ID starting with 3078f35048d664fef779f1c26471f153b2929306424994436f133131063eb23a not found: ID does not exist" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.870098 4907 scope.go:117] "RemoveContainer" containerID="2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36" Nov 29 14:32:39 crc kubenswrapper[4907]: E1129 14:32:39.871551 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\": container with ID starting with 2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36 not found: ID does not exist" containerID="2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.871616 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36"} err="failed to get container status \"2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\": rpc error: code = NotFound desc = could not find container \"2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36\": container with ID starting with 2c05ddbca6bbc9fca3235eb161608cec35ef20800cb75f9f15d6c214654c3e36 not found: ID does not exist" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.871639 4907 
scope.go:117] "RemoveContainer" containerID="3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027" Nov 29 14:32:39 crc kubenswrapper[4907]: E1129 14:32:39.872017 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\": container with ID starting with 3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027 not found: ID does not exist" containerID="3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027" Nov 29 14:32:39 crc kubenswrapper[4907]: I1129 14:32:39.872122 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027"} err="failed to get container status \"3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\": rpc error: code = NotFound desc = could not find container \"3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027\": container with ID starting with 3592a62ce88362779a750306def0713af295b7d25730c2a2e2fc06c6695c5027 not found: ID does not exist" Nov 29 14:32:39 crc kubenswrapper[4907]: E1129 14:32:39.974557 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="3.2s" Nov 29 14:32:40 crc kubenswrapper[4907]: I1129 14:32:40.491299 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 29 14:32:42 crc kubenswrapper[4907]: I1129 14:32:42.484864 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:43 crc kubenswrapper[4907]: E1129 14:32:43.175974 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="6.4s" Nov 29 14:32:43 crc kubenswrapper[4907]: I1129 14:32:43.651886 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" podUID="66db1fbb-f050-4af3-977b-831602348a9b" containerName="oauth-openshift" containerID="cri-o://3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475" gracePeriod=15 Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.135125 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.136481 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.137059 4907 status_manager.go:851] "Failed to get status for pod" podUID="66db1fbb-f050-4af3-977b-831602348a9b" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-qdmt6\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226300 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-serving-cert\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226376 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-session\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226415 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnl6v\" (UniqueName: \"kubernetes.io/projected/66db1fbb-f050-4af3-977b-831602348a9b-kube-api-access-tnl6v\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226471 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66db1fbb-f050-4af3-977b-831602348a9b-audit-dir\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226506 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-ocp-branding-template\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226538 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-login\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226571 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-trusted-ca-bundle\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226620 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-error\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226674 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-audit-policies\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226802 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-idp-0-file-data\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226811 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66db1fbb-f050-4af3-977b-831602348a9b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod 
"66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226856 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-cliconfig\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226914 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-service-ca\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.226971 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-router-certs\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.227042 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-provider-selection\") pod \"66db1fbb-f050-4af3-977b-831602348a9b\" (UID: \"66db1fbb-f050-4af3-977b-831602348a9b\") " Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.227373 4907 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66db1fbb-f050-4af3-977b-831602348a9b-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 
29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.228116 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.228308 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.228394 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.229922 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.233669 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66db1fbb-f050-4af3-977b-831602348a9b-kube-api-access-tnl6v" (OuterVolumeSpecName: "kube-api-access-tnl6v") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "kube-api-access-tnl6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.234830 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.235182 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.235652 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.236284 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.237184 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.237413 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.237811 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.238780 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "66db1fbb-f050-4af3-977b-831602348a9b" (UID: "66db1fbb-f050-4af3-977b-831602348a9b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329606 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnl6v\" (UniqueName: \"kubernetes.io/projected/66db1fbb-f050-4af3-977b-831602348a9b-kube-api-access-tnl6v\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329660 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329683 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329703 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329724 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329744 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329765 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329787 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329806 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329826 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329847 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329869 4907 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.329888 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/66db1fbb-f050-4af3-977b-831602348a9b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.742331 4907 generic.go:334] "Generic (PLEG): container finished" podID="66db1fbb-f050-4af3-977b-831602348a9b" containerID="3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475" exitCode=0 Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.742545 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.742531 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" event={"ID":"66db1fbb-f050-4af3-977b-831602348a9b","Type":"ContainerDied","Data":"3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475"} Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.742654 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" event={"ID":"66db1fbb-f050-4af3-977b-831602348a9b","Type":"ContainerDied","Data":"1c35c8bf13475d15e931ea366aa012ad3edcb35af19397bd34067a7402d00d53"} Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.742698 4907 scope.go:117] "RemoveContainer" containerID="3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.747182 4907 status_manager.go:851] "Failed to get status for pod" podUID="66db1fbb-f050-4af3-977b-831602348a9b" 
pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-qdmt6\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.749193 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.750323 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.752595 4907 status_manager.go:851] "Failed to get status for pod" podUID="66db1fbb-f050-4af3-977b-831602348a9b" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-qdmt6\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.776200 4907 scope.go:117] "RemoveContainer" containerID="3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475" Nov 29 14:32:44 crc kubenswrapper[4907]: E1129 14:32:44.776873 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475\": container with ID starting with 3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475 not found: ID does not exist" 
containerID="3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475" Nov 29 14:32:44 crc kubenswrapper[4907]: I1129 14:32:44.776929 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475"} err="failed to get container status \"3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475\": rpc error: code = NotFound desc = could not find container \"3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475\": container with ID starting with 3d44dfc3e593efd81b4bec211dee9306b242b2c68790daa839dbfc30db6e2475 not found: ID does not exist" Nov 29 14:32:45 crc kubenswrapper[4907]: E1129 14:32:45.866824 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.47:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187c80c97b8bff98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 14:32:36.871946136 +0000 UTC m=+254.858783828,LastTimestamp:2025-11-29 14:32:36.871946136 +0000 UTC m=+254.858783828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 29 14:32:49 crc kubenswrapper[4907]: E1129 14:32:49.089023 4907 kubelet_node_status.go:585] "Error updating node 
status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:32:49Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:32:49Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:32:49Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T14:32:49Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:49 crc kubenswrapper[4907]: E1129 14:32:49.090086 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:49 crc kubenswrapper[4907]: E1129 14:32:49.090552 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:49 crc kubenswrapper[4907]: E1129 14:32:49.090988 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 
38.102.83.47:6443: connect: connection refused" Nov 29 14:32:49 crc kubenswrapper[4907]: E1129 14:32:49.091267 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:49 crc kubenswrapper[4907]: E1129 14:32:49.091291 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 14:32:49 crc kubenswrapper[4907]: E1129 14:32:49.577106 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.47:6443: connect: connection refused" interval="7s" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.479546 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.480793 4907 status_manager.go:851] "Failed to get status for pod" podUID="66db1fbb-f050-4af3-977b-831602348a9b" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-qdmt6\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.481304 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.505721 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.505776 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:32:51 crc kubenswrapper[4907]: E1129 14:32:51.506344 4907 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.507224 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.794699 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b9b4e7379d28ded248cb55345b3e3e8fed10c1221d7a24172693753d26121148"} Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.800567 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.800665 4907 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192" exitCode=1 Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.800726 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192"} Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 
14:32:51.801678 4907 scope.go:117] "RemoveContainer" containerID="a8b94ac8ce9058bc9e5f9790ad1cb482f4ae7aff8278768078b631bd6f237192" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.803405 4907 status_manager.go:851] "Failed to get status for pod" podUID="66db1fbb-f050-4af3-977b-831602348a9b" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-qdmt6\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.803968 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.804531 4907 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:51 crc kubenswrapper[4907]: I1129 14:32:51.885735 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.490564 4907 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.491667 
4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.492305 4907 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.492952 4907 status_manager.go:851] "Failed to get status for pod" podUID="66db1fbb-f050-4af3-977b-831602348a9b" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-qdmt6\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.813164 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.813302 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"217f5e268a98735710e5acd868ce8e7596d941ac61a6fad4fda1123933b963ec"} Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.814168 4907 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.814743 4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.815327 4907 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.815809 4907 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="210dfaa0d1da351e3fb2858308290dea78944ce222ca61c488364cbab3a68b31" exitCode=0 Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.815869 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"210dfaa0d1da351e3fb2858308290dea78944ce222ca61c488364cbab3a68b31"} Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.815872 4907 status_manager.go:851] "Failed to get status for pod" podUID="66db1fbb-f050-4af3-977b-831602348a9b" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-qdmt6\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 
14:32:52.816117 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.816144 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.816551 4907 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: E1129 14:32:52.816621 4907 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.816980 4907 status_manager.go:851] "Failed to get status for pod" podUID="66db1fbb-f050-4af3-977b-831602348a9b" pod="openshift-authentication/oauth-openshift-558db77b4-qdmt6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-qdmt6\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.817475 4907 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:52 crc kubenswrapper[4907]: I1129 14:32:52.817935 
4907 status_manager.go:851] "Failed to get status for pod" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.47:6443: connect: connection refused" Nov 29 14:32:53 crc kubenswrapper[4907]: I1129 14:32:53.827989 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c1a90a40bd2be0d9af76a529fc86d7d04f58db985f17c5561c8aa988213fde1c"} Nov 29 14:32:53 crc kubenswrapper[4907]: I1129 14:32:53.828485 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"789ae8f8557c366ef0717795fe963c0a1859c21102eb16270db15ddbd1548eaa"} Nov 29 14:32:53 crc kubenswrapper[4907]: I1129 14:32:53.828500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e73c867f9bf8a810d56d7298ff1cb4a16d69fbcb98dc1359fd2a4f3d3bbbf560"} Nov 29 14:32:54 crc kubenswrapper[4907]: I1129 14:32:54.836137 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1bddb6d61733ff0faa3e1e8c39131f9e054efb1f6493ac3e8908e37b8120b2f7"} Nov 29 14:32:54 crc kubenswrapper[4907]: I1129 14:32:54.836631 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fb30c1b51e3fd24dc05a077026fa8ad1874b4a70ef93e773972cc76898a0af96"} Nov 29 14:32:54 crc kubenswrapper[4907]: I1129 14:32:54.836764 4907 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:54 crc kubenswrapper[4907]: I1129 14:32:54.836829 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:32:54 crc kubenswrapper[4907]: I1129 14:32:54.836856 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:32:56 crc kubenswrapper[4907]: I1129 14:32:56.507859 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:56 crc kubenswrapper[4907]: I1129 14:32:56.507931 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:56 crc kubenswrapper[4907]: I1129 14:32:56.516510 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:59 crc kubenswrapper[4907]: I1129 14:32:59.850293 4907 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:32:59 crc kubenswrapper[4907]: I1129 14:32:59.948115 4907 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="df915ceb-c89e-4d03-afa5-32224242cb74" Nov 29 14:33:00 crc kubenswrapper[4907]: I1129 14:33:00.080526 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:33:00 crc kubenswrapper[4907]: I1129 14:33:00.084145 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:33:00 crc kubenswrapper[4907]: I1129 
14:33:00.875930 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:33:00 crc kubenswrapper[4907]: I1129 14:33:00.877818 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:33:00 crc kubenswrapper[4907]: I1129 14:33:00.877866 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:33:00 crc kubenswrapper[4907]: I1129 14:33:00.881931 4907 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="df915ceb-c89e-4d03-afa5-32224242cb74" Nov 29 14:33:00 crc kubenswrapper[4907]: I1129 14:33:00.886145 4907 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://e73c867f9bf8a810d56d7298ff1cb4a16d69fbcb98dc1359fd2a4f3d3bbbf560" Nov 29 14:33:00 crc kubenswrapper[4907]: I1129 14:33:00.886184 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:33:01 crc kubenswrapper[4907]: I1129 14:33:01.880746 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:33:01 crc kubenswrapper[4907]: I1129 14:33:01.880787 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:33:01 crc kubenswrapper[4907]: I1129 14:33:01.885286 4907 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
podUID="df915ceb-c89e-4d03-afa5-32224242cb74" Nov 29 14:33:09 crc kubenswrapper[4907]: I1129 14:33:09.897756 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 29 14:33:09 crc kubenswrapper[4907]: I1129 14:33:09.957894 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 29 14:33:10 crc kubenswrapper[4907]: I1129 14:33:10.054202 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 29 14:33:10 crc kubenswrapper[4907]: I1129 14:33:10.075414 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 29 14:33:10 crc kubenswrapper[4907]: I1129 14:33:10.297297 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 29 14:33:10 crc kubenswrapper[4907]: I1129 14:33:10.593178 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 29 14:33:10 crc kubenswrapper[4907]: I1129 14:33:10.706424 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 29 14:33:10 crc kubenswrapper[4907]: I1129 14:33:10.853817 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 29 14:33:10 crc kubenswrapper[4907]: I1129 14:33:10.875112 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 29 14:33:11 crc kubenswrapper[4907]: I1129 14:33:11.079970 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 29 14:33:11 crc kubenswrapper[4907]: I1129 14:33:11.100294 4907 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 29 14:33:11 crc kubenswrapper[4907]: I1129 14:33:11.530519 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 29 14:33:11 crc kubenswrapper[4907]: I1129 14:33:11.643389 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 29 14:33:11 crc kubenswrapper[4907]: I1129 14:33:11.711136 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 29 14:33:11 crc kubenswrapper[4907]: I1129 14:33:11.891926 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 14:33:11 crc kubenswrapper[4907]: I1129 14:33:11.985885 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 29 14:33:11 crc kubenswrapper[4907]: I1129 14:33:11.992544 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 29 14:33:12 crc kubenswrapper[4907]: I1129 14:33:12.103932 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 29 14:33:12 crc kubenswrapper[4907]: I1129 14:33:12.410546 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 29 14:33:12 crc kubenswrapper[4907]: I1129 14:33:12.683395 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 29 14:33:12 crc kubenswrapper[4907]: I1129 14:33:12.751403 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 29 14:33:12 crc 
kubenswrapper[4907]: I1129 14:33:12.753517 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 29 14:33:12 crc kubenswrapper[4907]: I1129 14:33:12.765163 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 29 14:33:12 crc kubenswrapper[4907]: I1129 14:33:12.978796 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.111943 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.329403 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.335095 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.341754 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.493010 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.511231 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.598474 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.648428 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.694868 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.773221 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.965210 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 29 14:33:13 crc kubenswrapper[4907]: I1129 14:33:13.991319 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.008202 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.082925 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.090967 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.102034 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.108188 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.147320 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.186649 4907 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.280854 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.481264 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.577718 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.623415 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.745206 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.752190 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.877491 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.917062 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 29 14:33:14 crc kubenswrapper[4907]: I1129 14:33:14.970190 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.029697 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 
14:33:15.157580 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.213155 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.216292 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.227410 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.284992 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.368740 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.401821 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.483057 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.507843 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.569332 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.638812 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.655609 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.700902 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.753296 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.775870 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.778342 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.809066 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 29 14:33:15 crc kubenswrapper[4907]: I1129 14:33:15.979420 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.064728 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.076006 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.246952 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.312695 4907 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.323342 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.425002 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.485491 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.550660 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.584728 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.596024 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.684679 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.754423 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.857937 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.863332 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 29 14:33:16 crc kubenswrapper[4907]: I1129 14:33:16.950078 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.082332 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.093834 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.161558 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.174145 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.225379 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.239789 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.376612 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.393915 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.477341 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.526345 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.739185 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.767638 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.799943 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 29 14:33:17 crc kubenswrapper[4907]: I1129 14:33:17.983806 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.000743 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.014224 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.040628 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.152598 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.184208 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.250707 4907 
reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.286652 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.296418 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.324946 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.335146 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.353946 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.360658 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.370175 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.465127 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.483626 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.520291 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.526552 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.610614 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.679532 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.689323 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.722581 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.732221 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.752190 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.768001 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.950562 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 29 14:33:18 crc kubenswrapper[4907]: I1129 14:33:18.955122 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.052889 4907 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.084055 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.132919 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.208224 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.294192 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.386948 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.484679 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.549145 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.575550 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.633927 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.656363 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.696582 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.732417 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.800220 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.804332 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.908212 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.967277 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 29 14:33:19 crc kubenswrapper[4907]: I1129 14:33:19.984656 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.108178 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.117241 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.172974 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.230411 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.259121 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.266040 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.288401 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.317462 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.322918 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.331722 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.380148 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.409188 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.455416 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.526916 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.632636 
4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.680003 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.690274 4907 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.703060 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qdmt6","openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.703261 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7874f76df5-4xxts","openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 14:33:20 crc kubenswrapper[4907]: E1129 14:33:20.703872 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" containerName="installer" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.703961 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" containerName="installer" Nov 29 14:33:20 crc kubenswrapper[4907]: E1129 14:33:20.703991 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db1fbb-f050-4af3-977b-831602348a9b" containerName="oauth-openshift" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.704062 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db1fbb-f050-4af3-977b-831602348a9b" containerName="oauth-openshift" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.704573 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee00348-f65d-4dd4-ba5d-420b395cb2a0" containerName="installer" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.704664 4907 
memory_manager.go:354] "RemoveStaleState removing state" podUID="66db1fbb-f050-4af3-977b-831602348a9b" containerName="oauth-openshift" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.704862 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.704910 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="76002832-0954-42e0-85c2-fec6eef37411" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.706930 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.712435 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.712933 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.713138 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.712966 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.713041 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.713080 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.715499 4907 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.715724 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.716128 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.716402 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.716530 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.718890 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.720022 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.730239 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.749100 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.749914 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.751728 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 29 14:33:20 crc 
kubenswrapper[4907]: I1129 14:33:20.754528 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.754499133 podStartE2EDuration="21.754499133s" podCreationTimestamp="2025-11-29 14:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:33:20.746173246 +0000 UTC m=+298.733010928" watchObservedRunningTime="2025-11-29 14:33:20.754499133 +0000 UTC m=+298.741336825" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.789825 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.789899 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.789949 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-router-certs\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc 
kubenswrapper[4907]: I1129 14:33:20.790006 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790183 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790259 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-login\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790338 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-policies\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790386 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790523 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-dir\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790567 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-session\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790648 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790693 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b926f\" (UniqueName: \"kubernetes.io/projected/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-kube-api-access-b926f\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: 
\"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790877 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-service-ca\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.790951 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-error\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.889614 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.891734 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-session\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.891808 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7874f76df5-4xxts\" 
(UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.891844 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b926f\" (UniqueName: \"kubernetes.io/projected/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-kube-api-access-b926f\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.891919 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-service-ca\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.891969 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-error\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892004 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892038 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892083 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-router-certs\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892123 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892157 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892193 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-login\") pod 
\"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892227 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-policies\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892258 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892305 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-dir\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.892398 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-dir\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.893128 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-service-ca\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.893260 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.893791 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-policies\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.894646 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.900685 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-error\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc 
kubenswrapper[4907]: I1129 14:33:20.900901 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.901091 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.901856 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.902419 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-session\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.903105 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.903219 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-router-certs\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.909278 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-login\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.925242 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b926f\" (UniqueName: \"kubernetes.io/projected/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-kube-api-access-b926f\") pod \"oauth-openshift-7874f76df5-4xxts\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.958945 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 29 14:33:20 crc kubenswrapper[4907]: I1129 14:33:20.962056 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.046659 4907 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.065993 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.130687 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.150841 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.192709 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.221037 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.387995 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.416857 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.419658 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.438684 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.503308 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 14:33:21 
crc kubenswrapper[4907]: I1129 14:33:21.582913 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.642673 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.671984 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.703838 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.728212 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.812371 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.939843 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 29 14:33:21 crc kubenswrapper[4907]: I1129 14:33:21.953275 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.042287 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.259155 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.297013 4907 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Nov 29 14:33:22 crc 
kubenswrapper[4907]: I1129 14:33:22.343416 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.348030 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.361849 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.440026 4907 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.498388 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66db1fbb-f050-4af3-977b-831602348a9b" path="/var/lib/kubelet/pods/66db1fbb-f050-4af3-977b-831602348a9b/volumes" Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.532153 4907 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.532551 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f" gracePeriod=5 Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.684343 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.787003 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 29 14:33:22 crc kubenswrapper[4907]: I1129 14:33:22.802368 4907 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.055745 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.107202 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.117089 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.118230 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.165802 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.210302 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.243374 4907 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.245367 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.318778 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.516918 4907 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 29 14:33:23 
crc kubenswrapper[4907]: I1129 14:33:23.539720 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.590478 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.628874 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.762899 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.955309 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 29 14:33:23 crc kubenswrapper[4907]: I1129 14:33:23.963338 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.003410 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.044521 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.056683 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.095631 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.161011 4907 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.234374 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.243745 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7874f76df5-4xxts"] Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.284494 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.424342 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.512714 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.535875 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.569755 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7874f76df5-4xxts"] Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.789773 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 29 14:33:24 crc kubenswrapper[4907]: I1129 14:33:24.964146 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.041300 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" 
event={"ID":"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86","Type":"ContainerStarted","Data":"e97e70be3b54102a5a9eb94dcbab74c8a3a8b77e1e66a1ae11933001a68efeb6"} Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.041344 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" event={"ID":"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86","Type":"ContainerStarted","Data":"7f07c0e3538593d917b939b3ebdcb0d93aae020ba5ea5b3a071ff0ed043ed6eb"} Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.042484 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.050759 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.258155 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.312607 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.376121 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.408235 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" podStartSLOduration=67.408209013 podStartE2EDuration="1m7.408209013s" podCreationTimestamp="2025-11-29 14:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:33:25.07776048 +0000 UTC m=+303.064598132" watchObservedRunningTime="2025-11-29 
14:33:25.408209013 +0000 UTC m=+303.395046695" Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.461691 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 14:33:25 crc kubenswrapper[4907]: I1129 14:33:25.467002 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 29 14:33:26 crc kubenswrapper[4907]: I1129 14:33:26.005351 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 14:33:26 crc kubenswrapper[4907]: I1129 14:33:26.181699 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.685639 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.685801 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.710560 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.710615 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.710674 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.710761 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.710774 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.710819 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.710855 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.710868 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.711134 4907 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.711152 4907 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.711125 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.711164 4907 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.724075 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.812379 4907 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:27 crc kubenswrapper[4907]: I1129 14:33:27.812429 4907 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:28 crc kubenswrapper[4907]: I1129 14:33:28.063369 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 29 14:33:28 crc kubenswrapper[4907]: I1129 14:33:28.063511 4907 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f" exitCode=137 Nov 29 14:33:28 crc kubenswrapper[4907]: I1129 14:33:28.063600 4907 scope.go:117] "RemoveContainer" containerID="5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f" Nov 29 14:33:28 crc kubenswrapper[4907]: I1129 14:33:28.063661 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 14:33:28 crc kubenswrapper[4907]: I1129 14:33:28.092373 4907 scope.go:117] "RemoveContainer" containerID="5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f" Nov 29 14:33:28 crc kubenswrapper[4907]: E1129 14:33:28.092949 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f\": container with ID starting with 5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f not found: ID does not exist" containerID="5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f" Nov 29 14:33:28 crc kubenswrapper[4907]: I1129 14:33:28.093029 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f"} err="failed to get container status \"5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f\": rpc error: code = NotFound desc = could not find container \"5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f\": container with ID starting with 5b06e4b9248c83708e7650eced1471f02cfe573d1043bef91b766f1da7699f7f not found: ID does not exist" Nov 29 14:33:28 crc kubenswrapper[4907]: I1129 14:33:28.492349 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 29 14:33:38 crc kubenswrapper[4907]: I1129 14:33:38.930608 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 29 14:33:42 crc kubenswrapper[4907]: I1129 14:33:42.332002 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 29 14:33:43 
crc kubenswrapper[4907]: I1129 14:33:43.389987 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gpnkx"] Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.390327 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" podUID="415a984e-e3ba-4e39-adb2-f79c8ed05f3f" containerName="controller-manager" containerID="cri-o://cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b" gracePeriod=30 Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.484810 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr"] Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.485020 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" podUID="72fe0643-91a8-459e-aec7-257e5b07ea41" containerName="route-controller-manager" containerID="cri-o://e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162" gracePeriod=30 Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.637978 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.717862 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.811637 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.841421 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-client-ca\") pod \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.841510 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-serving-cert\") pod \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.841547 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g6mg\" (UniqueName: \"kubernetes.io/projected/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-kube-api-access-2g6mg\") pod \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.841623 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-proxy-ca-bundles\") pod \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.841656 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-config\") pod \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\" (UID: \"415a984e-e3ba-4e39-adb2-f79c8ed05f3f\") " Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.842545 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "415a984e-e3ba-4e39-adb2-f79c8ed05f3f" (UID: "415a984e-e3ba-4e39-adb2-f79c8ed05f3f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.842709 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-client-ca" (OuterVolumeSpecName: "client-ca") pod "415a984e-e3ba-4e39-adb2-f79c8ed05f3f" (UID: "415a984e-e3ba-4e39-adb2-f79c8ed05f3f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.842763 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-config" (OuterVolumeSpecName: "config") pod "415a984e-e3ba-4e39-adb2-f79c8ed05f3f" (UID: "415a984e-e3ba-4e39-adb2-f79c8ed05f3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.848590 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-kube-api-access-2g6mg" (OuterVolumeSpecName: "kube-api-access-2g6mg") pod "415a984e-e3ba-4e39-adb2-f79c8ed05f3f" (UID: "415a984e-e3ba-4e39-adb2-f79c8ed05f3f"). InnerVolumeSpecName "kube-api-access-2g6mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.848718 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "415a984e-e3ba-4e39-adb2-f79c8ed05f3f" (UID: "415a984e-e3ba-4e39-adb2-f79c8ed05f3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.942619 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72fe0643-91a8-459e-aec7-257e5b07ea41-serving-cert\") pod \"72fe0643-91a8-459e-aec7-257e5b07ea41\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.942803 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-config\") pod \"72fe0643-91a8-459e-aec7-257e5b07ea41\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.943601 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvzfj\" (UniqueName: \"kubernetes.io/projected/72fe0643-91a8-459e-aec7-257e5b07ea41-kube-api-access-fvzfj\") pod \"72fe0643-91a8-459e-aec7-257e5b07ea41\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.943673 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-client-ca\") pod \"72fe0643-91a8-459e-aec7-257e5b07ea41\" (UID: \"72fe0643-91a8-459e-aec7-257e5b07ea41\") " Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.943941 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-config" (OuterVolumeSpecName: "config") pod "72fe0643-91a8-459e-aec7-257e5b07ea41" (UID: "72fe0643-91a8-459e-aec7-257e5b07ea41"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.944103 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.944202 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.944230 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.944248 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.944270 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g6mg\" (UniqueName: \"kubernetes.io/projected/415a984e-e3ba-4e39-adb2-f79c8ed05f3f-kube-api-access-2g6mg\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.944290 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.944579 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-client-ca" (OuterVolumeSpecName: "client-ca") pod "72fe0643-91a8-459e-aec7-257e5b07ea41" (UID: "72fe0643-91a8-459e-aec7-257e5b07ea41"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.947157 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72fe0643-91a8-459e-aec7-257e5b07ea41-kube-api-access-fvzfj" (OuterVolumeSpecName: "kube-api-access-fvzfj") pod "72fe0643-91a8-459e-aec7-257e5b07ea41" (UID: "72fe0643-91a8-459e-aec7-257e5b07ea41"). InnerVolumeSpecName "kube-api-access-fvzfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:33:43 crc kubenswrapper[4907]: I1129 14:33:43.948491 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fe0643-91a8-459e-aec7-257e5b07ea41-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "72fe0643-91a8-459e-aec7-257e5b07ea41" (UID: "72fe0643-91a8-459e-aec7-257e5b07ea41"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.044897 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72fe0643-91a8-459e-aec7-257e5b07ea41-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.044958 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72fe0643-91a8-459e-aec7-257e5b07ea41-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.044975 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvzfj\" (UniqueName: \"kubernetes.io/projected/72fe0643-91a8-459e-aec7-257e5b07ea41-kube-api-access-fvzfj\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.170461 4907 generic.go:334] "Generic (PLEG): container finished" podID="72fe0643-91a8-459e-aec7-257e5b07ea41" 
containerID="e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162" exitCode=0 Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.170612 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.170760 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" event={"ID":"72fe0643-91a8-459e-aec7-257e5b07ea41","Type":"ContainerDied","Data":"e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162"} Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.171134 4907 scope.go:117] "RemoveContainer" containerID="e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.171045 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr" event={"ID":"72fe0643-91a8-459e-aec7-257e5b07ea41","Type":"ContainerDied","Data":"d222cdfcde1854b9c7aca914be13e2f5f353ed421a3d6d984dfa6640cb6aaa69"} Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.171640 4907 generic.go:334] "Generic (PLEG): container finished" podID="415a984e-e3ba-4e39-adb2-f79c8ed05f3f" containerID="cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b" exitCode=0 Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.171681 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" event={"ID":"415a984e-e3ba-4e39-adb2-f79c8ed05f3f","Type":"ContainerDied","Data":"cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b"} Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.171709 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" 
event={"ID":"415a984e-e3ba-4e39-adb2-f79c8ed05f3f","Type":"ContainerDied","Data":"d2312c7f8967527eaed614713b0d7b7d5dc820425f8ea3f3af9f83507890de92"} Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.171746 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gpnkx" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.201037 4907 scope.go:117] "RemoveContainer" containerID="e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162" Nov 29 14:33:44 crc kubenswrapper[4907]: E1129 14:33:44.202163 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162\": container with ID starting with e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162 not found: ID does not exist" containerID="e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.202223 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162"} err="failed to get container status \"e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162\": rpc error: code = NotFound desc = could not find container \"e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162\": container with ID starting with e015939c8f741020f9a67a2e83d3feb7bc827756298dddbcbd94ce4503562162 not found: ID does not exist" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.202252 4907 scope.go:117] "RemoveContainer" containerID="cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.221152 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gpnkx"] Nov 29 14:33:44 crc 
kubenswrapper[4907]: I1129 14:33:44.228016 4907 scope.go:117] "RemoveContainer" containerID="cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b" Nov 29 14:33:44 crc kubenswrapper[4907]: E1129 14:33:44.231632 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b\": container with ID starting with cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b not found: ID does not exist" containerID="cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.231711 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b"} err="failed to get container status \"cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b\": rpc error: code = NotFound desc = could not find container \"cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b\": container with ID starting with cb43dc0696c17bc0074ec14fc29159f1232bb49d52fdeba0f10652fd02c0aa9b not found: ID does not exist" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.232362 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gpnkx"] Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.260522 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr"] Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.271732 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-q2sxr"] Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.485997 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="415a984e-e3ba-4e39-adb2-f79c8ed05f3f" 
path="/var/lib/kubelet/pods/415a984e-e3ba-4e39-adb2-f79c8ed05f3f/volumes" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.486531 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72fe0643-91a8-459e-aec7-257e5b07ea41" path="/var/lib/kubelet/pods/72fe0643-91a8-459e-aec7-257e5b07ea41/volumes" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.864959 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55b7d96bcc-9sknj"] Nov 29 14:33:44 crc kubenswrapper[4907]: E1129 14:33:44.865150 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415a984e-e3ba-4e39-adb2-f79c8ed05f3f" containerName="controller-manager" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.865161 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="415a984e-e3ba-4e39-adb2-f79c8ed05f3f" containerName="controller-manager" Nov 29 14:33:44 crc kubenswrapper[4907]: E1129 14:33:44.865176 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72fe0643-91a8-459e-aec7-257e5b07ea41" containerName="route-controller-manager" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.865182 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="72fe0643-91a8-459e-aec7-257e5b07ea41" containerName="route-controller-manager" Nov 29 14:33:44 crc kubenswrapper[4907]: E1129 14:33:44.865189 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.865196 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.865283 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.865293 4907 
memory_manager.go:354] "RemoveStaleState removing state" podUID="72fe0643-91a8-459e-aec7-257e5b07ea41" containerName="route-controller-manager" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.865305 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="415a984e-e3ba-4e39-adb2-f79c8ed05f3f" containerName="controller-manager" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.865646 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.869586 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.869705 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.870058 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.870174 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.878770 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.882748 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55b7d96bcc-9sknj"] Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.883981 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.885853 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.892987 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7"] Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.893627 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.903773 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.908153 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.908256 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.908330 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.908474 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.908294 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.924763 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7"] Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.952126 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7"] Nov 29 14:33:44 crc kubenswrapper[4907]: E1129 14:33:44.952923 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-b8kx9 serving-cert], unattached volumes=[], failed to process volumes=[client-ca config kube-api-access-b8kx9 serving-cert]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" podUID="89a6b7c3-6a02-48ce-8eea-96b880408a63" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.959323 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttp7s\" (UniqueName: \"kubernetes.io/projected/6f6a95a6-d143-4a44-8850-7ab2959a767c-kube-api-access-ttp7s\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.959366 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-proxy-ca-bundles\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.959470 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6a95a6-d143-4a44-8850-7ab2959a767c-serving-cert\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.959520 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-client-ca\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.959548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-config\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.959589 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-client-ca\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.959616 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-config\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.959644 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8kx9\" (UniqueName: \"kubernetes.io/projected/89a6b7c3-6a02-48ce-8eea-96b880408a63-kube-api-access-b8kx9\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: 
\"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:44 crc kubenswrapper[4907]: I1129 14:33:44.959665 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89a6b7c3-6a02-48ce-8eea-96b880408a63-serving-cert\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.061637 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8kx9\" (UniqueName: \"kubernetes.io/projected/89a6b7c3-6a02-48ce-8eea-96b880408a63-kube-api-access-b8kx9\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.061693 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89a6b7c3-6a02-48ce-8eea-96b880408a63-serving-cert\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.061730 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttp7s\" (UniqueName: \"kubernetes.io/projected/6f6a95a6-d143-4a44-8850-7ab2959a767c-kube-api-access-ttp7s\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.061746 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-proxy-ca-bundles\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.061786 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6a95a6-d143-4a44-8850-7ab2959a767c-serving-cert\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.061816 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-client-ca\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.061838 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-config\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.061868 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-client-ca\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 
29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.061895 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-config\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.063004 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-config\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.064549 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-client-ca\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.064911 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-client-ca\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.065490 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-config\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " 
pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.065636 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-proxy-ca-bundles\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.068631 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89a6b7c3-6a02-48ce-8eea-96b880408a63-serving-cert\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.075954 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6a95a6-d143-4a44-8850-7ab2959a767c-serving-cert\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.081135 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8kx9\" (UniqueName: \"kubernetes.io/projected/89a6b7c3-6a02-48ce-8eea-96b880408a63-kube-api-access-b8kx9\") pod \"route-controller-manager-674db754b8-xjcb7\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.081280 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttp7s\" (UniqueName: 
\"kubernetes.io/projected/6f6a95a6-d143-4a44-8850-7ab2959a767c-kube-api-access-ttp7s\") pod \"controller-manager-55b7d96bcc-9sknj\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.141089 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.179466 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.184041 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.189409 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.264030 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89a6b7c3-6a02-48ce-8eea-96b880408a63-serving-cert\") pod \"89a6b7c3-6a02-48ce-8eea-96b880408a63\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.264112 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-client-ca\") pod \"89a6b7c3-6a02-48ce-8eea-96b880408a63\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.264156 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-config\") pod \"89a6b7c3-6a02-48ce-8eea-96b880408a63\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.264205 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8kx9\" (UniqueName: \"kubernetes.io/projected/89a6b7c3-6a02-48ce-8eea-96b880408a63-kube-api-access-b8kx9\") pod \"89a6b7c3-6a02-48ce-8eea-96b880408a63\" (UID: \"89a6b7c3-6a02-48ce-8eea-96b880408a63\") " Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.264794 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-client-ca" (OuterVolumeSpecName: "client-ca") pod "89a6b7c3-6a02-48ce-8eea-96b880408a63" (UID: "89a6b7c3-6a02-48ce-8eea-96b880408a63"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.264809 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-config" (OuterVolumeSpecName: "config") pod "89a6b7c3-6a02-48ce-8eea-96b880408a63" (UID: "89a6b7c3-6a02-48ce-8eea-96b880408a63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.271355 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a6b7c3-6a02-48ce-8eea-96b880408a63-kube-api-access-b8kx9" (OuterVolumeSpecName: "kube-api-access-b8kx9") pod "89a6b7c3-6a02-48ce-8eea-96b880408a63" (UID: "89a6b7c3-6a02-48ce-8eea-96b880408a63"). InnerVolumeSpecName "kube-api-access-b8kx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.271941 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a6b7c3-6a02-48ce-8eea-96b880408a63-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "89a6b7c3-6a02-48ce-8eea-96b880408a63" (UID: "89a6b7c3-6a02-48ce-8eea-96b880408a63"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.365886 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89a6b7c3-6a02-48ce-8eea-96b880408a63-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.365937 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.365951 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a6b7c3-6a02-48ce-8eea-96b880408a63-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.365965 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8kx9\" (UniqueName: \"kubernetes.io/projected/89a6b7c3-6a02-48ce-8eea-96b880408a63-kube-api-access-b8kx9\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.412670 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55b7d96bcc-9sknj"] Nov 29 14:33:45 crc kubenswrapper[4907]: I1129 14:33:45.447410 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55b7d96bcc-9sknj"] Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.187686 4907 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.188046 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" podUID="6f6a95a6-d143-4a44-8850-7ab2959a767c" containerName="controller-manager" containerID="cri-o://8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8" gracePeriod=30 Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.187707 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" event={"ID":"6f6a95a6-d143-4a44-8850-7ab2959a767c","Type":"ContainerStarted","Data":"8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8"} Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.188103 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.188113 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" event={"ID":"6f6a95a6-d143-4a44-8850-7ab2959a767c","Type":"ContainerStarted","Data":"e451c050f0dfce54fb88d9a628c360b6887cf1feb93dd1d28157f084860b9b87"} Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.192859 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.216328 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" podStartSLOduration=2.216304057 podStartE2EDuration="2.216304057s" podCreationTimestamp="2025-11-29 14:33:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:33:46.212570231 +0000 UTC m=+324.199407893" watchObservedRunningTime="2025-11-29 14:33:46.216304057 +0000 UTC m=+324.203141709" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.237876 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7"] Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.250662 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674db754b8-xjcb7"] Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.485629 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a6b7c3-6a02-48ce-8eea-96b880408a63" path="/var/lib/kubelet/pods/89a6b7c3-6a02-48ce-8eea-96b880408a63/volumes" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.535785 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.581816 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6a95a6-d143-4a44-8850-7ab2959a767c-serving-cert\") pod \"6f6a95a6-d143-4a44-8850-7ab2959a767c\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.581870 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttp7s\" (UniqueName: \"kubernetes.io/projected/6f6a95a6-d143-4a44-8850-7ab2959a767c-kube-api-access-ttp7s\") pod \"6f6a95a6-d143-4a44-8850-7ab2959a767c\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.581941 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-client-ca\") pod \"6f6a95a6-d143-4a44-8850-7ab2959a767c\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.581963 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-proxy-ca-bundles\") pod \"6f6a95a6-d143-4a44-8850-7ab2959a767c\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.581990 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-config\") pod \"6f6a95a6-d143-4a44-8850-7ab2959a767c\" (UID: \"6f6a95a6-d143-4a44-8850-7ab2959a767c\") " Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.582878 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-client-ca" (OuterVolumeSpecName: "client-ca") pod "6f6a95a6-d143-4a44-8850-7ab2959a767c" (UID: "6f6a95a6-d143-4a44-8850-7ab2959a767c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.583038 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-config" (OuterVolumeSpecName: "config") pod "6f6a95a6-d143-4a44-8850-7ab2959a767c" (UID: "6f6a95a6-d143-4a44-8850-7ab2959a767c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.583213 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6f6a95a6-d143-4a44-8850-7ab2959a767c" (UID: "6f6a95a6-d143-4a44-8850-7ab2959a767c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.584581 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.584602 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.584612 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f6a95a6-d143-4a44-8850-7ab2959a767c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.600941 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6a95a6-d143-4a44-8850-7ab2959a767c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6f6a95a6-d143-4a44-8850-7ab2959a767c" (UID: "6f6a95a6-d143-4a44-8850-7ab2959a767c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.601059 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f6a95a6-d143-4a44-8850-7ab2959a767c-kube-api-access-ttp7s" (OuterVolumeSpecName: "kube-api-access-ttp7s") pod "6f6a95a6-d143-4a44-8850-7ab2959a767c" (UID: "6f6a95a6-d143-4a44-8850-7ab2959a767c"). InnerVolumeSpecName "kube-api-access-ttp7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.685862 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttp7s\" (UniqueName: \"kubernetes.io/projected/6f6a95a6-d143-4a44-8850-7ab2959a767c-kube-api-access-ttp7s\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.685907 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6a95a6-d143-4a44-8850-7ab2959a767c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.892222 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-586fc87d4f-7chp8"] Nov 29 14:33:46 crc kubenswrapper[4907]: E1129 14:33:46.892620 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6a95a6-d143-4a44-8850-7ab2959a767c" containerName="controller-manager" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.892642 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6a95a6-d143-4a44-8850-7ab2959a767c" containerName="controller-manager" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.892770 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f6a95a6-d143-4a44-8850-7ab2959a767c" containerName="controller-manager" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.893424 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.895685 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"] Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.896776 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.901949 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.902044 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.902328 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.902419 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.902529 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.902687 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.916070 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"] Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.922520 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-586fc87d4f-7chp8"] Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.988505 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsv6t\" (UniqueName: \"kubernetes.io/projected/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-kube-api-access-vsv6t\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.988554 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-proxy-ca-bundles\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.988584 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmc5\" (UniqueName: \"kubernetes.io/projected/7b8c3a33-ee77-440d-a9ff-62b9741ead78-kube-api-access-swmc5\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.988607 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-serving-cert\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.988646 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-config\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.988666 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c3a33-ee77-440d-a9ff-62b9741ead78-serving-cert\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.988685 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-client-ca\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.988711 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-client-ca\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" Nov 29 14:33:46 crc kubenswrapper[4907]: I1129 14:33:46.988754 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-config\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: 
\"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.089492 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmc5\" (UniqueName: \"kubernetes.io/projected/7b8c3a33-ee77-440d-a9ff-62b9741ead78-kube-api-access-swmc5\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.089786 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-serving-cert\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.089915 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-config\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.090023 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c3a33-ee77-440d-a9ff-62b9741ead78-serving-cert\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.090126 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-client-ca\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.090229 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-client-ca\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.090319 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-config\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.090404 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsv6t\" (UniqueName: \"kubernetes.io/projected/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-kube-api-access-vsv6t\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.090518 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-proxy-ca-bundles\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.091093 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-client-ca\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.091364 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-client-ca\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.091415 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-config\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.091724 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-config\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.091848 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-proxy-ca-bundles\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.094662 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c3a33-ee77-440d-a9ff-62b9741ead78-serving-cert\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.103049 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-serving-cert\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.105642 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmc5\" (UniqueName: \"kubernetes.io/projected/7b8c3a33-ee77-440d-a9ff-62b9741ead78-kube-api-access-swmc5\") pod \"route-controller-manager-5fcf54cb6f-5r5xw\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") " pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.107056 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsv6t\" (UniqueName: \"kubernetes.io/projected/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-kube-api-access-vsv6t\") pod \"controller-manager-586fc87d4f-7chp8\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") " pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.221579 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.227696 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.303278 4907 generic.go:334] "Generic (PLEG): container finished" podID="6f6a95a6-d143-4a44-8850-7ab2959a767c" containerID="8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8" exitCode=0
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.303331 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" event={"ID":"6f6a95a6-d143-4a44-8850-7ab2959a767c","Type":"ContainerDied","Data":"8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8"}
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.303724 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj" event={"ID":"6f6a95a6-d143-4a44-8850-7ab2959a767c","Type":"ContainerDied","Data":"e451c050f0dfce54fb88d9a628c360b6887cf1feb93dd1d28157f084860b9b87"}
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.303746 4907 scope.go:117] "RemoveContainer" containerID="8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.303389 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55b7d96bcc-9sknj"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.338659 4907 scope.go:117] "RemoveContainer" containerID="8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8"
Nov 29 14:33:47 crc kubenswrapper[4907]: E1129 14:33:47.340454 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8\": container with ID starting with 8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8 not found: ID does not exist" containerID="8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.340500 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8"} err="failed to get container status \"8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8\": rpc error: code = NotFound desc = could not find container \"8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8\": container with ID starting with 8f0efc7b69ffcbcc31ccd83469f0db528a847fa9d346d15c3aec5099db157ad8 not found: ID does not exist"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.349967 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55b7d96bcc-9sknj"]
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.359935 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55b7d96bcc-9sknj"]
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.367486 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.558990 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-586fc87d4f-7chp8"]
Nov 29 14:33:47 crc kubenswrapper[4907]: W1129 14:33:47.568406 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ed5aac_d0b1_4f5f_bcac_8eba46ee68c3.slice/crio-a7908a1f014b160745ed403ea0da2759c74cb4a44ef0561a4a48528f4dd2a8d5 WatchSource:0}: Error finding container a7908a1f014b160745ed403ea0da2759c74cb4a44ef0561a4a48528f4dd2a8d5: Status 404 returned error can't find the container with id a7908a1f014b160745ed403ea0da2759c74cb4a44ef0561a4a48528f4dd2a8d5
Nov 29 14:33:47 crc kubenswrapper[4907]: I1129 14:33:47.621960 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"]
Nov 29 14:33:47 crc kubenswrapper[4907]: W1129 14:33:47.642609 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b8c3a33_ee77_440d_a9ff_62b9741ead78.slice/crio-956def6ab5cd8bf8f213d7501fdd860cecdae617c16dcf9acdee5ee55a46027d WatchSource:0}: Error finding container 956def6ab5cd8bf8f213d7501fdd860cecdae617c16dcf9acdee5ee55a46027d: Status 404 returned error can't find the container with id 956def6ab5cd8bf8f213d7501fdd860cecdae617c16dcf9acdee5ee55a46027d
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.081634 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.310493 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" event={"ID":"7b8c3a33-ee77-440d-a9ff-62b9741ead78","Type":"ContainerStarted","Data":"5b0fab963347ff63b0f170b94e7bf99466079d07542447254c25bd27436a6c26"}
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.310553 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" event={"ID":"7b8c3a33-ee77-440d-a9ff-62b9741ead78","Type":"ContainerStarted","Data":"956def6ab5cd8bf8f213d7501fdd860cecdae617c16dcf9acdee5ee55a46027d"}
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.310703 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.314465 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" event={"ID":"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3","Type":"ContainerStarted","Data":"308211387e75e0a666ad266b6f50eba88292b0137dd544e89ae64e547f4287be"}
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.314514 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" event={"ID":"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3","Type":"ContainerStarted","Data":"a7908a1f014b160745ed403ea0da2759c74cb4a44ef0561a4a48528f4dd2a8d5"}
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.314821 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.320258 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.348682 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" podStartSLOduration=3.3486666290000002 podStartE2EDuration="3.348666629s" podCreationTimestamp="2025-11-29 14:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:33:48.347605616 +0000 UTC m=+326.334443268" watchObservedRunningTime="2025-11-29 14:33:48.348666629 +0000 UTC m=+326.335504281"
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.352207 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" podStartSLOduration=3.352200858 podStartE2EDuration="3.352200858s" podCreationTimestamp="2025-11-29 14:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:33:48.328315799 +0000 UTC m=+326.315153451" watchObservedRunningTime="2025-11-29 14:33:48.352200858 +0000 UTC m=+326.339038510"
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.486418 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6a95a6-d143-4a44-8850-7ab2959a767c" path="/var/lib/kubelet/pods/6f6a95a6-d143-4a44-8850-7ab2959a767c/volumes"
Nov 29 14:33:48 crc kubenswrapper[4907]: I1129 14:33:48.526183 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:33:51 crc kubenswrapper[4907]: I1129 14:33:51.899639 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Nov 29 14:33:52 crc kubenswrapper[4907]: I1129 14:33:52.394942 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Nov 29 14:34:00 crc kubenswrapper[4907]: I1129 14:34:00.161552 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Nov 29 14:34:03 crc kubenswrapper[4907]: I1129 14:34:03.375371 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-586fc87d4f-7chp8"]
Nov 29 14:34:03 crc kubenswrapper[4907]: I1129 14:34:03.375911 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" podUID="42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" containerName="controller-manager" containerID="cri-o://308211387e75e0a666ad266b6f50eba88292b0137dd544e89ae64e547f4287be" gracePeriod=30
Nov 29 14:34:03 crc kubenswrapper[4907]: I1129 14:34:03.392292 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"]
Nov 29 14:34:03 crc kubenswrapper[4907]: I1129 14:34:03.392891 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" podUID="7b8c3a33-ee77-440d-a9ff-62b9741ead78" containerName="route-controller-manager" containerID="cri-o://5b0fab963347ff63b0f170b94e7bf99466079d07542447254c25bd27436a6c26" gracePeriod=30
Nov 29 14:34:03 crc kubenswrapper[4907]: I1129 14:34:03.807946 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.425040 4907 generic.go:334] "Generic (PLEG): container finished" podID="42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" containerID="308211387e75e0a666ad266b6f50eba88292b0137dd544e89ae64e547f4287be" exitCode=0
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.425143 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" event={"ID":"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3","Type":"ContainerDied","Data":"308211387e75e0a666ad266b6f50eba88292b0137dd544e89ae64e547f4287be"}
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.427032 4907 generic.go:334] "Generic (PLEG): container finished" podID="7b8c3a33-ee77-440d-a9ff-62b9741ead78" containerID="5b0fab963347ff63b0f170b94e7bf99466079d07542447254c25bd27436a6c26" exitCode=0
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.427064 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" event={"ID":"7b8c3a33-ee77-440d-a9ff-62b9741ead78","Type":"ContainerDied","Data":"5b0fab963347ff63b0f170b94e7bf99466079d07542447254c25bd27436a6c26"}
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.544530 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.548670 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.567575 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"]
Nov 29 14:34:04 crc kubenswrapper[4907]: E1129 14:34:04.567769 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8c3a33-ee77-440d-a9ff-62b9741ead78" containerName="route-controller-manager"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.567781 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8c3a33-ee77-440d-a9ff-62b9741ead78" containerName="route-controller-manager"
Nov 29 14:34:04 crc kubenswrapper[4907]: E1129 14:34:04.567794 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" containerName="controller-manager"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.567800 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" containerName="controller-manager"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.567891 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" containerName="controller-manager"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.567903 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8c3a33-ee77-440d-a9ff-62b9741ead78" containerName="route-controller-manager"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.568227 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.585938 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"]
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.662880 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c3a33-ee77-440d-a9ff-62b9741ead78-serving-cert\") pod \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") "
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.663750 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsv6t\" (UniqueName: \"kubernetes.io/projected/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-kube-api-access-vsv6t\") pod \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") "
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.663855 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-config\") pod \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") "
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.663878 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-proxy-ca-bundles\") pod \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") "
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.663981 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-client-ca\") pod \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") "
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.664034 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-serving-cert\") pod \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") "
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.664095 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-client-ca\") pod \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\" (UID: \"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3\") "
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.664152 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swmc5\" (UniqueName: \"kubernetes.io/projected/7b8c3a33-ee77-440d-a9ff-62b9741ead78-kube-api-access-swmc5\") pod \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") "
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.664194 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-config\") pod \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\" (UID: \"7b8c3a33-ee77-440d-a9ff-62b9741ead78\") "
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.664577 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpt8q\" (UniqueName: \"kubernetes.io/projected/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-kube-api-access-zpt8q\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.664639 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-config\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.664642 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-client-ca" (OuterVolumeSpecName: "client-ca") pod "7b8c3a33-ee77-440d-a9ff-62b9741ead78" (UID: "7b8c3a33-ee77-440d-a9ff-62b9741ead78"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.664662 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" (UID: "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.664728 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-config" (OuterVolumeSpecName: "config") pod "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" (UID: "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.665327 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" (UID: "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.665389 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-client-ca\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.665519 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-serving-cert\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.665616 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-client-ca\") on node \"crc\" DevicePath \"\""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.665640 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-config\") on node \"crc\" DevicePath \"\""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.665650 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.665659 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-client-ca\") on node \"crc\" DevicePath \"\""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.665667 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-config" (OuterVolumeSpecName: "config") pod "7b8c3a33-ee77-440d-a9ff-62b9741ead78" (UID: "7b8c3a33-ee77-440d-a9ff-62b9741ead78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.670675 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8c3a33-ee77-440d-a9ff-62b9741ead78-kube-api-access-swmc5" (OuterVolumeSpecName: "kube-api-access-swmc5") pod "7b8c3a33-ee77-440d-a9ff-62b9741ead78" (UID: "7b8c3a33-ee77-440d-a9ff-62b9741ead78"). InnerVolumeSpecName "kube-api-access-swmc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.670845 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b8c3a33-ee77-440d-a9ff-62b9741ead78-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7b8c3a33-ee77-440d-a9ff-62b9741ead78" (UID: "7b8c3a33-ee77-440d-a9ff-62b9741ead78"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.671536 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" (UID: "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.675644 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-kube-api-access-vsv6t" (OuterVolumeSpecName: "kube-api-access-vsv6t") pod "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" (UID: "42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3"). InnerVolumeSpecName "kube-api-access-vsv6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.767589 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpt8q\" (UniqueName: \"kubernetes.io/projected/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-kube-api-access-zpt8q\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.767694 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-config\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.767767 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-client-ca\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.767823 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-serving-cert\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.768065 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.768241 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swmc5\" (UniqueName: \"kubernetes.io/projected/7b8c3a33-ee77-440d-a9ff-62b9741ead78-kube-api-access-swmc5\") on node \"crc\" DevicePath \"\""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.768281 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8c3a33-ee77-440d-a9ff-62b9741ead78-config\") on node \"crc\" DevicePath \"\""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.768303 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsv6t\" (UniqueName: \"kubernetes.io/projected/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3-kube-api-access-vsv6t\") on node \"crc\" DevicePath \"\""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.768323 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b8c3a33-ee77-440d-a9ff-62b9741ead78-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.769006 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-client-ca\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.770478 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-config\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.777119 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-serving-cert\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.784088 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpt8q\" (UniqueName: \"kubernetes.io/projected/b44d1b44-fe10-41c0-ac20-7dd5363f1e1c-kube-api-access-zpt8q\") pod \"route-controller-manager-5fc8dfd9f9-jf7g8\" (UID: \"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c\") " pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:04 crc kubenswrapper[4907]: I1129 14:34:04.883288 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"
Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.178318 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8"]
Nov 29 14:34:05 crc kubenswrapper[4907]: W1129 14:34:05.184136 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb44d1b44_fe10_41c0_ac20_7dd5363f1e1c.slice/crio-0733f6851d9fe0e6c2a83d2833ab5f537cb01c7ffc465e276accf2cecf230055 WatchSource:0}: Error finding container 0733f6851d9fe0e6c2a83d2833ab5f537cb01c7ffc465e276accf2cecf230055: Status 404 returned error can't find the container with id 0733f6851d9fe0e6c2a83d2833ab5f537cb01c7ffc465e276accf2cecf230055
Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.363896 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vqj4q"]
Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.365064 4907 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.386720 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vqj4q"] Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.434696 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" event={"ID":"42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3","Type":"ContainerDied","Data":"a7908a1f014b160745ed403ea0da2759c74cb4a44ef0561a4a48528f4dd2a8d5"} Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.434760 4907 scope.go:117] "RemoveContainer" containerID="308211387e75e0a666ad266b6f50eba88292b0137dd544e89ae64e547f4287be" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.434716 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-586fc87d4f-7chp8" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.436066 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8" event={"ID":"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c","Type":"ContainerStarted","Data":"2b5831b82fcee0cbb3b7d8419daad6099ca75a8dadbe437f61184aa3e0e4d74a"} Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.436107 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8" event={"ID":"b44d1b44-fe10-41c0-ac20-7dd5363f1e1c","Type":"ContainerStarted","Data":"0733f6851d9fe0e6c2a83d2833ab5f537cb01c7ffc465e276accf2cecf230055"} Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.436222 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.439184 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" event={"ID":"7b8c3a33-ee77-440d-a9ff-62b9741ead78","Type":"ContainerDied","Data":"956def6ab5cd8bf8f213d7501fdd860cecdae617c16dcf9acdee5ee55a46027d"} Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.439452 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.448616 4907 scope.go:117] "RemoveContainer" containerID="5b0fab963347ff63b0f170b94e7bf99466079d07542447254c25bd27436a6c26" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.469381 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8" podStartSLOduration=2.469355823 podStartE2EDuration="2.469355823s" podCreationTimestamp="2025-11-29 14:34:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:34:05.466886586 +0000 UTC m=+343.453724238" watchObservedRunningTime="2025-11-29 14:34:05.469355823 +0000 UTC m=+343.456193465" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.476279 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-trusted-ca\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.476459 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.476591 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbmm9\" (UniqueName: \"kubernetes.io/projected/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-kube-api-access-pbmm9\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.477051 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.477186 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-registry-tls\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.477320 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-registry-certificates\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" 
Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.477398 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-bound-sa-token\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.477497 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.480467 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"] Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.486718 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fcf54cb6f-5r5xw"] Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.489929 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-586fc87d4f-7chp8"] Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.492858 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-586fc87d4f-7chp8"] Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.503972 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: 
\"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.578225 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-registry-certificates\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.578279 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-bound-sa-token\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.578301 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.578345 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-trusted-ca\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.578374 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbmm9\" (UniqueName: 
\"kubernetes.io/projected/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-kube-api-access-pbmm9\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.578403 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.578426 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-registry-tls\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.579237 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.579737 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-trusted-ca\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.580061 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-registry-certificates\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.590468 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-registry-tls\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.590484 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.604606 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-bound-sa-token\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.608613 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbmm9\" (UniqueName: \"kubernetes.io/projected/d43b96ff-4ae6-4e81-88b5-156a7c407a4e-kube-api-access-pbmm9\") pod \"image-registry-66df7c8f76-vqj4q\" (UID: \"d43b96ff-4ae6-4e81-88b5-156a7c407a4e\") " pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.683009 4907 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.866237 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5fc8dfd9f9-jf7g8" Nov 29 14:34:05 crc kubenswrapper[4907]: I1129 14:34:05.965714 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-7874f76df5-4xxts"] Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.162804 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vqj4q"] Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.447883 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" event={"ID":"d43b96ff-4ae6-4e81-88b5-156a7c407a4e","Type":"ContainerStarted","Data":"ce37bcab44c040ede99786969741f14132bb026e8df7222fc3cefdaa8cfd45b8"} Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.448322 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" event={"ID":"d43b96ff-4ae6-4e81-88b5-156a7c407a4e","Type":"ContainerStarted","Data":"b40c9d0af2cae97cddd54880cb9e6d7e04e04636fd22929fc3fbd056f6750902"} Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.448343 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.463427 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" podStartSLOduration=1.4634160729999999 podStartE2EDuration="1.463416073s" podCreationTimestamp="2025-11-29 14:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-11-29 14:34:06.46332549 +0000 UTC m=+344.450163162" watchObservedRunningTime="2025-11-29 14:34:06.463416073 +0000 UTC m=+344.450253725" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.486254 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3" path="/var/lib/kubelet/pods/42ed5aac-d0b1-4f5f-bcac-8eba46ee68c3/volumes" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.487705 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8c3a33-ee77-440d-a9ff-62b9741ead78" path="/var/lib/kubelet/pods/7b8c3a33-ee77-440d-a9ff-62b9741ead78/volumes" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.908819 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55b7d96bcc-xf29x"] Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.909918 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.913541 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.914196 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.915296 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.915641 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.919133 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 
14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.919239 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.925763 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55b7d96bcc-xf29x"] Nov 29 14:34:06 crc kubenswrapper[4907]: I1129 14:34:06.927531 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.105526 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mxz\" (UniqueName: \"kubernetes.io/projected/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-kube-api-access-p7mxz\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.105628 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-serving-cert\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.105670 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-proxy-ca-bundles\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.106831 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-config\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.106903 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-client-ca\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.208050 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-config\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.208570 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-client-ca\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.208870 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mxz\" (UniqueName: \"kubernetes.io/projected/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-kube-api-access-p7mxz\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " 
pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.209143 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-serving-cert\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.209375 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-proxy-ca-bundles\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.209792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-config\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.210186 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-client-ca\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.211412 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-proxy-ca-bundles\") pod 
\"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.219203 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-serving-cert\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.240339 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mxz\" (UniqueName: \"kubernetes.io/projected/76f8bdc8-dd3e-4f83-a7c5-83686f66709a-kube-api-access-p7mxz\") pod \"controller-manager-55b7d96bcc-xf29x\" (UID: \"76f8bdc8-dd3e-4f83-a7c5-83686f66709a\") " pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:07 crc kubenswrapper[4907]: I1129 14:34:07.536051 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:08 crc kubenswrapper[4907]: I1129 14:34:08.012013 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55b7d96bcc-xf29x"] Nov 29 14:34:08 crc kubenswrapper[4907]: I1129 14:34:08.471065 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" event={"ID":"76f8bdc8-dd3e-4f83-a7c5-83686f66709a","Type":"ContainerStarted","Data":"ca08834599f923645c290139a5fc025b2cadc188ca1b51c6119fb8c4e69e79a6"} Nov 29 14:34:08 crc kubenswrapper[4907]: I1129 14:34:08.471341 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:08 crc kubenswrapper[4907]: I1129 14:34:08.471355 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" event={"ID":"76f8bdc8-dd3e-4f83-a7c5-83686f66709a","Type":"ContainerStarted","Data":"ca5726d0d792ed103ef205bcc9170f4da5c06d79de2ffe9b3b1e97e1862b29da"} Nov 29 14:34:08 crc kubenswrapper[4907]: I1129 14:34:08.477915 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" Nov 29 14:34:08 crc kubenswrapper[4907]: I1129 14:34:08.492545 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55b7d96bcc-xf29x" podStartSLOduration=5.492521795 podStartE2EDuration="5.492521795s" podCreationTimestamp="2025-11-29 14:34:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:34:08.489937094 +0000 UTC m=+346.476774756" watchObservedRunningTime="2025-11-29 14:34:08.492521795 +0000 UTC m=+346.479359457" Nov 29 14:34:25 crc 
kubenswrapper[4907]: I1129 14:34:25.695774 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vqj4q" Nov 29 14:34:25 crc kubenswrapper[4907]: I1129 14:34:25.794007 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56l8g"] Nov 29 14:34:28 crc kubenswrapper[4907]: I1129 14:34:28.489910 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:34:28 crc kubenswrapper[4907]: I1129 14:34:28.490391 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:34:30 crc kubenswrapper[4907]: I1129 14:34:30.993358 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" podUID="e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" containerName="oauth-openshift" containerID="cri-o://e97e70be3b54102a5a9eb94dcbab74c8a3a8b77e1e66a1ae11933001a68efeb6" gracePeriod=15 Nov 29 14:34:31 crc kubenswrapper[4907]: I1129 14:34:31.048604 4907 patch_prober.go:28] interesting pod/oauth-openshift-7874f76df5-4xxts container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body= Nov 29 14:34:31 crc kubenswrapper[4907]: I1129 14:34:31.048754 4907 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" podUID="e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Nov 29 14:34:31 crc kubenswrapper[4907]: I1129 14:34:31.660843 4907 generic.go:334] "Generic (PLEG): container finished" podID="e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" containerID="e97e70be3b54102a5a9eb94dcbab74c8a3a8b77e1e66a1ae11933001a68efeb6" exitCode=0 Nov 29 14:34:31 crc kubenswrapper[4907]: I1129 14:34:31.661064 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" event={"ID":"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86","Type":"ContainerDied","Data":"e97e70be3b54102a5a9eb94dcbab74c8a3a8b77e1e66a1ae11933001a68efeb6"} Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.096077 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.148778 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-798864cdcb-8cxxm"] Nov 29 14:34:32 crc kubenswrapper[4907]: E1129 14:34:32.149796 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" containerName="oauth-openshift" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.149844 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" containerName="oauth-openshift" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.150393 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" containerName="oauth-openshift" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.152752 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.182544 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-798864cdcb-8cxxm"] Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212249 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-ocp-branding-template\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212317 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-dir\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212359 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-login\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212386 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-trusted-ca-bundle\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212410 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-error\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212469 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-cliconfig\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212553 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-session\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212576 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-policies\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212598 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b926f\" (UniqueName: \"kubernetes.io/projected/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-kube-api-access-b926f\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212625 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-service-ca\") pod 
\"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212663 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-router-certs\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212750 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-idp-0-file-data\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212825 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-provider-selection\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.212859 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-serving-cert\") pod \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\" (UID: \"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86\") " Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.215609 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). 
InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.216071 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.216103 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.216704 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.216710 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.219971 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.220123 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.220180 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.220484 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-kube-api-access-b926f" (OuterVolumeSpecName: "kube-api-access-b926f") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "kube-api-access-b926f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.221060 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.222211 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.222625 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.222707 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.223637 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" (UID: "e1a4dc4f-ceaa-4e4b-826c-49c26c08be86"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.314944 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-template-login\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.315601 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.315789 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-router-certs\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.315913 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-audit-policies\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.316047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m85kx\" (UniqueName: \"kubernetes.io/projected/d22d9054-6784-43bc-b418-b8dc191b847f-kube-api-access-m85kx\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.316177 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-session\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.316305 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-template-error\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.316425 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.316579 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d22d9054-6784-43bc-b418-b8dc191b847f-audit-dir\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.316707 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.317048 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.317184 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: 
\"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.317320 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-service-ca\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.317508 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.317675 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.317776 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.317870 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 
14:34:32.317976 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318117 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318219 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b926f\" (UniqueName: \"kubernetes.io/projected/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-kube-api-access-b926f\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318316 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318402 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318516 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318618 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-provider-selection\") on node \"crc\" 
DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318705 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318797 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318883 4907 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.318977 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.420926 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-session\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421009 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-template-error\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") 
" pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421064 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421112 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d22d9054-6784-43bc-b418-b8dc191b847f-audit-dir\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421154 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421226 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421264 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421308 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-service-ca\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421345 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421392 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-template-login\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421466 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: 
\"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421517 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-router-certs\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421549 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-audit-policies\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.421588 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m85kx\" (UniqueName: \"kubernetes.io/projected/d22d9054-6784-43bc-b418-b8dc191b847f-kube-api-access-m85kx\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.422915 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d22d9054-6784-43bc-b418-b8dc191b847f-audit-dir\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.423584 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-service-ca\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.424600 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.424735 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-audit-policies\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.427605 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.427933 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc 
kubenswrapper[4907]: I1129 14:34:32.428404 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.428822 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-template-login\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.429498 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-template-error\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.429599 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-session\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.431096 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-router-certs\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.432318 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.435202 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d22d9054-6784-43bc-b418-b8dc191b847f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.455901 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m85kx\" (UniqueName: \"kubernetes.io/projected/d22d9054-6784-43bc-b418-b8dc191b847f-kube-api-access-m85kx\") pod \"oauth-openshift-798864cdcb-8cxxm\" (UID: \"d22d9054-6784-43bc-b418-b8dc191b847f\") " pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.486121 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.670517 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.670558 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7874f76df5-4xxts" event={"ID":"e1a4dc4f-ceaa-4e4b-826c-49c26c08be86","Type":"ContainerDied","Data":"7f07c0e3538593d917b939b3ebdcb0d93aae020ba5ea5b3a071ff0ed043ed6eb"} Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.670691 4907 scope.go:117] "RemoveContainer" containerID="e97e70be3b54102a5a9eb94dcbab74c8a3a8b77e1e66a1ae11933001a68efeb6" Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.709151 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-7874f76df5-4xxts"] Nov 29 14:34:32 crc kubenswrapper[4907]: I1129 14:34:32.715647 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-7874f76df5-4xxts"] Nov 29 14:34:33 crc kubenswrapper[4907]: I1129 14:34:33.005327 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-798864cdcb-8cxxm"] Nov 29 14:34:33 crc kubenswrapper[4907]: I1129 14:34:33.682524 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" event={"ID":"d22d9054-6784-43bc-b418-b8dc191b847f","Type":"ContainerStarted","Data":"6d4d61d420c53ec7896a14b1e262395c006ed52460d8924ea3450712028b36a6"} Nov 29 14:34:33 crc kubenswrapper[4907]: I1129 14:34:33.684089 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:33 crc kubenswrapper[4907]: I1129 14:34:33.684280 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" 
event={"ID":"d22d9054-6784-43bc-b418-b8dc191b847f","Type":"ContainerStarted","Data":"2a78befbcee1454ff5646bdd158532b051db859544b51c27fdd7232a36f0a256"} Nov 29 14:34:33 crc kubenswrapper[4907]: I1129 14:34:33.709203 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" podStartSLOduration=28.70918124 podStartE2EDuration="28.70918124s" podCreationTimestamp="2025-11-29 14:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:34:33.705850066 +0000 UTC m=+371.692687718" watchObservedRunningTime="2025-11-29 14:34:33.70918124 +0000 UTC m=+371.696018892" Nov 29 14:34:33 crc kubenswrapper[4907]: I1129 14:34:33.751237 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-798864cdcb-8cxxm" Nov 29 14:34:34 crc kubenswrapper[4907]: I1129 14:34:34.493837 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a4dc4f-ceaa-4e4b-826c-49c26c08be86" path="/var/lib/kubelet/pods/e1a4dc4f-ceaa-4e4b-826c-49c26c08be86/volumes" Nov 29 14:34:50 crc kubenswrapper[4907]: I1129 14:34:50.859599 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" podUID="e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" containerName="registry" containerID="cri-o://22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297" gracePeriod=30 Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.369716 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.510793 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-certificates\") pod \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.510893 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-ca-trust-extracted\") pod \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.511147 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.511274 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-installation-pull-secrets\") pod \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.511344 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-trusted-ca\") pod \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.511384 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-tls\") pod \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.511476 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjdt6\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-kube-api-access-gjdt6\") pod \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.511521 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-bound-sa-token\") pod \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\" (UID: \"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c\") " Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.512781 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.512857 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.520599 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-kube-api-access-gjdt6" (OuterVolumeSpecName: "kube-api-access-gjdt6") pod "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c"). InnerVolumeSpecName "kube-api-access-gjdt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.520627 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.521812 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.530592 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.531340 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.548412 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" (UID: "e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.613887 4907 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.614488 4907 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.614527 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.614640 4907 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.614696 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjdt6\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-kube-api-access-gjdt6\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.614727 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.614756 4907 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.970583 4907 generic.go:334] "Generic (PLEG): container finished" podID="e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" containerID="22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297" exitCode=0 Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.970658 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" event={"ID":"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c","Type":"ContainerDied","Data":"22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297"} Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.970738 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" event={"ID":"e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c","Type":"ContainerDied","Data":"ec91088d4c9418375c41e4709bb5f99d3de146cfbb57699dc859d0459e2ceae8"} Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.970770 4907 scope.go:117] "RemoveContainer" 
containerID="22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.970675 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-56l8g" Nov 29 14:34:51 crc kubenswrapper[4907]: I1129 14:34:51.999806 4907 scope.go:117] "RemoveContainer" containerID="22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297" Nov 29 14:34:52 crc kubenswrapper[4907]: E1129 14:34:52.000397 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297\": container with ID starting with 22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297 not found: ID does not exist" containerID="22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297" Nov 29 14:34:52 crc kubenswrapper[4907]: I1129 14:34:52.000493 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297"} err="failed to get container status \"22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297\": rpc error: code = NotFound desc = could not find container \"22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297\": container with ID starting with 22ced13bbac3b2ffbdb4d4b05b8f379e8d6341f81d878aaad6804c8b2f6c3297 not found: ID does not exist" Nov 29 14:34:52 crc kubenswrapper[4907]: I1129 14:34:52.021510 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56l8g"] Nov 29 14:34:52 crc kubenswrapper[4907]: I1129 14:34:52.029329 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-56l8g"] Nov 29 14:34:52 crc kubenswrapper[4907]: I1129 14:34:52.490990 4907 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" path="/var/lib/kubelet/pods/e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c/volumes" Nov 29 14:34:58 crc kubenswrapper[4907]: I1129 14:34:58.490478 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:34:58 crc kubenswrapper[4907]: I1129 14:34:58.491001 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.844699 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g2mls"] Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.846792 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g2mls" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerName="registry-server" containerID="cri-o://0ad995ca53cf80aa49f06846b9b65f43df6b584652576b05a5ae26c2ac69ae8f" gracePeriod=30 Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.862225 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjtqd"] Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.862460 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tjtqd" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerName="registry-server" containerID="cri-o://618c711bbb2cb49384e0c0b40678b60b2e5d32e4ff47e43bff97dad1479cb56f" gracePeriod=30 Nov 29 14:34:59 
crc kubenswrapper[4907]: I1129 14:34:59.874726 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtr2r"] Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.875049 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" podUID="2c98c8e5-b9f1-43ca-93f6-cb74695dd076" containerName="marketplace-operator" containerID="cri-o://2963f2ecc0957dfa3dd57ccce4fe0af6dd55e43b657e7da6e48ea1470582aee4" gracePeriod=30 Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.890063 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkhhq"] Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.890616 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kkhhq" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" containerName="registry-server" containerID="cri-o://e62a45ae586078083ec6fc30cdcd364d54c008d7312d7eb7da5bfbecf5d97b89" gracePeriod=30 Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.894088 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ltxlt"] Nov 29 14:34:59 crc kubenswrapper[4907]: E1129 14:34:59.894370 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" containerName="registry" Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.894389 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" containerName="registry" Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.894523 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ae2318-a4a0-4a3f-8744-a0e42e5fa50c" containerName="registry" Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.895028 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.912000 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-487xn"] Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.912310 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-487xn" podUID="c6b02860-46c7-4498-a162-8e2833deb120" containerName="registry-server" containerID="cri-o://6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc" gracePeriod=30 Nov 29 14:34:59 crc kubenswrapper[4907]: I1129 14:34:59.928661 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ltxlt"] Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.032158 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb4l2\" (UniqueName: \"kubernetes.io/projected/7964d25d-6ab7-44e4-9737-41d44ea2a311-kube-api-access-gb4l2\") pod \"marketplace-operator-79b997595-ltxlt\" (UID: \"7964d25d-6ab7-44e4-9737-41d44ea2a311\") " pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.032536 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7964d25d-6ab7-44e4-9737-41d44ea2a311-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ltxlt\" (UID: \"7964d25d-6ab7-44e4-9737-41d44ea2a311\") " pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.032568 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/7964d25d-6ab7-44e4-9737-41d44ea2a311-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ltxlt\" (UID: \"7964d25d-6ab7-44e4-9737-41d44ea2a311\") " pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.034856 4907 generic.go:334] "Generic (PLEG): container finished" podID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerID="0ad995ca53cf80aa49f06846b9b65f43df6b584652576b05a5ae26c2ac69ae8f" exitCode=0 Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.034922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2mls" event={"ID":"10f3989d-c7bb-4c4a-91e1-3b0afaedac98","Type":"ContainerDied","Data":"0ad995ca53cf80aa49f06846b9b65f43df6b584652576b05a5ae26c2ac69ae8f"} Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.040368 4907 generic.go:334] "Generic (PLEG): container finished" podID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerID="618c711bbb2cb49384e0c0b40678b60b2e5d32e4ff47e43bff97dad1479cb56f" exitCode=0 Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.040466 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjtqd" event={"ID":"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74","Type":"ContainerDied","Data":"618c711bbb2cb49384e0c0b40678b60b2e5d32e4ff47e43bff97dad1479cb56f"} Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.048942 4907 generic.go:334] "Generic (PLEG): container finished" podID="2c98c8e5-b9f1-43ca-93f6-cb74695dd076" containerID="2963f2ecc0957dfa3dd57ccce4fe0af6dd55e43b657e7da6e48ea1470582aee4" exitCode=0 Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.048998 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" event={"ID":"2c98c8e5-b9f1-43ca-93f6-cb74695dd076","Type":"ContainerDied","Data":"2963f2ecc0957dfa3dd57ccce4fe0af6dd55e43b657e7da6e48ea1470582aee4"} Nov 
29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.053568 4907 generic.go:334] "Generic (PLEG): container finished" podID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" containerID="e62a45ae586078083ec6fc30cdcd364d54c008d7312d7eb7da5bfbecf5d97b89" exitCode=0 Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.053604 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkhhq" event={"ID":"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337","Type":"ContainerDied","Data":"e62a45ae586078083ec6fc30cdcd364d54c008d7312d7eb7da5bfbecf5d97b89"} Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.133545 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb4l2\" (UniqueName: \"kubernetes.io/projected/7964d25d-6ab7-44e4-9737-41d44ea2a311-kube-api-access-gb4l2\") pod \"marketplace-operator-79b997595-ltxlt\" (UID: \"7964d25d-6ab7-44e4-9737-41d44ea2a311\") " pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.133606 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7964d25d-6ab7-44e4-9737-41d44ea2a311-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ltxlt\" (UID: \"7964d25d-6ab7-44e4-9737-41d44ea2a311\") " pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.133636 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7964d25d-6ab7-44e4-9737-41d44ea2a311-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ltxlt\" (UID: \"7964d25d-6ab7-44e4-9737-41d44ea2a311\") " pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.135033 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7964d25d-6ab7-44e4-9737-41d44ea2a311-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ltxlt\" (UID: \"7964d25d-6ab7-44e4-9737-41d44ea2a311\") " pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.146246 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7964d25d-6ab7-44e4-9737-41d44ea2a311-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ltxlt\" (UID: \"7964d25d-6ab7-44e4-9737-41d44ea2a311\") " pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.157474 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb4l2\" (UniqueName: \"kubernetes.io/projected/7964d25d-6ab7-44e4-9737-41d44ea2a311-kube-api-access-gb4l2\") pod \"marketplace-operator-79b997595-ltxlt\" (UID: \"7964d25d-6ab7-44e4-9737-41d44ea2a311\") " pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.271599 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.319409 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g2mls" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.409647 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.418728 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.441604 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.450696 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-catalog-content\") pod \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.451425 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-utilities\") pod \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.451777 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-utilities" (OuterVolumeSpecName: "utilities") pod "10f3989d-c7bb-4c4a-91e1-3b0afaedac98" (UID: "10f3989d-c7bb-4c4a-91e1-3b0afaedac98"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.453166 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd4kk\" (UniqueName: \"kubernetes.io/projected/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-kube-api-access-dd4kk\") pod \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\" (UID: \"10f3989d-c7bb-4c4a-91e1-3b0afaedac98\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.455655 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-kube-api-access-dd4kk" (OuterVolumeSpecName: "kube-api-access-dd4kk") pod "10f3989d-c7bb-4c4a-91e1-3b0afaedac98" (UID: "10f3989d-c7bb-4c4a-91e1-3b0afaedac98"). InnerVolumeSpecName "kube-api-access-dd4kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.456310 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.456324 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd4kk\" (UniqueName: \"kubernetes.io/projected/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-kube-api-access-dd4kk\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.459820 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjtqd" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.514877 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10f3989d-c7bb-4c4a-91e1-3b0afaedac98" (UID: "10f3989d-c7bb-4c4a-91e1-3b0afaedac98"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.556936 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-operator-metrics\") pod \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.556983 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-utilities\") pod \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557031 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-utilities\") pod \"c6b02860-46c7-4498-a162-8e2833deb120\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557076 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-trusted-ca\") pod \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557383 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-catalog-content\") pod \"c6b02860-46c7-4498-a162-8e2833deb120\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557417 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-catalog-content\") pod \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557457 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99mgl\" (UniqueName: \"kubernetes.io/projected/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-kube-api-access-99mgl\") pod \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557480 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwbrj\" (UniqueName: \"kubernetes.io/projected/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-kube-api-access-pwbrj\") pod \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557507 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-catalog-content\") pod \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\" (UID: \"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557529 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns79s\" (UniqueName: \"kubernetes.io/projected/c6b02860-46c7-4498-a162-8e2833deb120-kube-api-access-ns79s\") pod \"c6b02860-46c7-4498-a162-8e2833deb120\" (UID: \"c6b02860-46c7-4498-a162-8e2833deb120\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557547 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rmkd\" (UniqueName: \"kubernetes.io/projected/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-kube-api-access-2rmkd\") pod 
\"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\" (UID: \"2c98c8e5-b9f1-43ca-93f6-cb74695dd076\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.557747 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10f3989d-c7bb-4c4a-91e1-3b0afaedac98-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.558007 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-utilities" (OuterVolumeSpecName: "utilities") pod "c6b02860-46c7-4498-a162-8e2833deb120" (UID: "c6b02860-46c7-4498-a162-8e2833deb120"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.560293 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-kube-api-access-2rmkd" (OuterVolumeSpecName: "kube-api-access-2rmkd") pod "2c98c8e5-b9f1-43ca-93f6-cb74695dd076" (UID: "2c98c8e5-b9f1-43ca-93f6-cb74695dd076"). InnerVolumeSpecName "kube-api-access-2rmkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.560502 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2c98c8e5-b9f1-43ca-93f6-cb74695dd076" (UID: "2c98c8e5-b9f1-43ca-93f6-cb74695dd076"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.560645 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-kube-api-access-pwbrj" (OuterVolumeSpecName: "kube-api-access-pwbrj") pod "e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" (UID: "e2b415cf-bdf4-4f9b-8ce8-a75b1a026337"). InnerVolumeSpecName "kube-api-access-pwbrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.560833 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-kube-api-access-99mgl" (OuterVolumeSpecName: "kube-api-access-99mgl") pod "f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" (UID: "f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74"). InnerVolumeSpecName "kube-api-access-99mgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.562127 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b02860-46c7-4498-a162-8e2833deb120-kube-api-access-ns79s" (OuterVolumeSpecName: "kube-api-access-ns79s") pod "c6b02860-46c7-4498-a162-8e2833deb120" (UID: "c6b02860-46c7-4498-a162-8e2833deb120"). InnerVolumeSpecName "kube-api-access-ns79s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.565787 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-utilities" (OuterVolumeSpecName: "utilities") pod "e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" (UID: "e2b415cf-bdf4-4f9b-8ce8-a75b1a026337"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.566506 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2c98c8e5-b9f1-43ca-93f6-cb74695dd076" (UID: "2c98c8e5-b9f1-43ca-93f6-cb74695dd076"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.579237 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" (UID: "e2b415cf-bdf4-4f9b-8ce8-a75b1a026337"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.616245 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" (UID: "f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.658641 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-utilities\") pod \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\" (UID: \"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74\") " Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659135 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99mgl\" (UniqueName: \"kubernetes.io/projected/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-kube-api-access-99mgl\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659161 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwbrj\" (UniqueName: \"kubernetes.io/projected/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-kube-api-access-pwbrj\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659176 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659192 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns79s\" (UniqueName: \"kubernetes.io/projected/c6b02860-46c7-4498-a162-8e2833deb120-kube-api-access-ns79s\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659206 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rmkd\" (UniqueName: \"kubernetes.io/projected/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-kube-api-access-2rmkd\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659225 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659241 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659253 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659267 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c98c8e5-b9f1-43ca-93f6-cb74695dd076-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.659280 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.660771 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-utilities" (OuterVolumeSpecName: "utilities") pod "f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" (UID: "f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.661825 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6b02860-46c7-4498-a162-8e2833deb120" (UID: "c6b02860-46c7-4498-a162-8e2833deb120"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.760391 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.760467 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6b02860-46c7-4498-a162-8e2833deb120-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:35:00 crc kubenswrapper[4907]: I1129 14:35:00.766461 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ltxlt"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.066122 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" event={"ID":"2c98c8e5-b9f1-43ca-93f6-cb74695dd076","Type":"ContainerDied","Data":"cd5127b44244351f1b7c47a819489bcd00fe63db5cc51afe60565c3ca72163a8"} Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.066199 4907 scope.go:117] "RemoveContainer" containerID="2963f2ecc0957dfa3dd57ccce4fe0af6dd55e43b657e7da6e48ea1470582aee4" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.066383 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qtr2r" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.071568 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkhhq" event={"ID":"e2b415cf-bdf4-4f9b-8ce8-a75b1a026337","Type":"ContainerDied","Data":"3dfc4ce91b8faa153a6bbc939a698a4b9ca7f89f7dfbe817eeb70ff0a614f86a"} Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.071626 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkhhq" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.076228 4907 generic.go:334] "Generic (PLEG): container finished" podID="c6b02860-46c7-4498-a162-8e2833deb120" containerID="6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc" exitCode=0 Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.076293 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-487xn" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.076299 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487xn" event={"ID":"c6b02860-46c7-4498-a162-8e2833deb120","Type":"ContainerDied","Data":"6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc"} Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.076379 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487xn" event={"ID":"c6b02860-46c7-4498-a162-8e2833deb120","Type":"ContainerDied","Data":"59f50a6032f0d79cf5621464cff6d6d30e6ea33f419263c5e96365a9e1bb934d"} Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.085367 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2mls" event={"ID":"10f3989d-c7bb-4c4a-91e1-3b0afaedac98","Type":"ContainerDied","Data":"aa73259d33c33175c9cb1968385086fa74e11d3312b48830b6771da48e02aa5e"} Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.085384 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g2mls" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.087794 4907 scope.go:117] "RemoveContainer" containerID="e62a45ae586078083ec6fc30cdcd364d54c008d7312d7eb7da5bfbecf5d97b89" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.089838 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" event={"ID":"7964d25d-6ab7-44e4-9737-41d44ea2a311","Type":"ContainerStarted","Data":"587d060ee92eceb48461f1582f2dbfecbd57185798939e34312b7ebfb647c9b9"} Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.089883 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" event={"ID":"7964d25d-6ab7-44e4-9737-41d44ea2a311","Type":"ContainerStarted","Data":"25a594d17730de2dee1f6fff156f4e9c1e6d5fffcb73632e96842b92b4fe13a3"} Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.090075 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.093854 4907 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ltxlt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.093928 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" podUID="7964d25d-6ab7-44e4-9737-41d44ea2a311" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.098038 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-tjtqd" event={"ID":"f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74","Type":"ContainerDied","Data":"299f55338ea93b3549a5faa5b765abffd36116e1f1b1774695cb40b0db6811a8"} Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.098085 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjtqd" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.111364 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" podStartSLOduration=2.111349787 podStartE2EDuration="2.111349787s" podCreationTimestamp="2025-11-29 14:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:35:01.109257007 +0000 UTC m=+399.096094699" watchObservedRunningTime="2025-11-29 14:35:01.111349787 +0000 UTC m=+399.098187439" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.122989 4907 scope.go:117] "RemoveContainer" containerID="41ddbae1c1da75b1d93a17336cd6ef88c11ebf045f6fb2fa7d07d244fdf3746f" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.137502 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkhhq"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.141258 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkhhq"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.152298 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g2mls"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.155109 4907 scope.go:117] "RemoveContainer" containerID="74dd0fce1fd2e279d775da1cd97cc604577ed8fa5cde0cfbd7c92d11809380c2" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.161393 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-g2mls"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.168943 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjtqd"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.175494 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tjtqd"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.180328 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtr2r"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.182676 4907 scope.go:117] "RemoveContainer" containerID="6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.185508 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qtr2r"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.189535 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-487xn"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.191616 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-487xn"] Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.193870 4907 scope.go:117] "RemoveContainer" containerID="6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.211525 4907 scope.go:117] "RemoveContainer" containerID="6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.225426 4907 scope.go:117] "RemoveContainer" containerID="6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc" Nov 29 14:35:01 crc kubenswrapper[4907]: E1129 14:35:01.225927 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc\": container with ID starting with 6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc not found: ID does not exist" containerID="6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.225997 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc"} err="failed to get container status \"6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc\": rpc error: code = NotFound desc = could not find container \"6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc\": container with ID starting with 6f1500f01cf386bba81cf6b1299c09386cc15113e54e78d12cf9964497a7e3fc not found: ID does not exist" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.226029 4907 scope.go:117] "RemoveContainer" containerID="6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9" Nov 29 14:35:01 crc kubenswrapper[4907]: E1129 14:35:01.226422 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9\": container with ID starting with 6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9 not found: ID does not exist" containerID="6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.226471 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9"} err="failed to get container status \"6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9\": rpc error: code = NotFound desc = could not find container \"6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9\": 
container with ID starting with 6703a43a7fee3c9cfc67b8051cb40787ae580513b9d59a6a00ed61f0ce0445f9 not found: ID does not exist" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.226490 4907 scope.go:117] "RemoveContainer" containerID="6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53" Nov 29 14:35:01 crc kubenswrapper[4907]: E1129 14:35:01.226948 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53\": container with ID starting with 6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53 not found: ID does not exist" containerID="6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.226983 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53"} err="failed to get container status \"6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53\": rpc error: code = NotFound desc = could not find container \"6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53\": container with ID starting with 6d4a6a1a8d5385377b1fa061b394b348bc7def23b211c6fb76ad8d0e0fa6bb53 not found: ID does not exist" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.227009 4907 scope.go:117] "RemoveContainer" containerID="0ad995ca53cf80aa49f06846b9b65f43df6b584652576b05a5ae26c2ac69ae8f" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.241150 4907 scope.go:117] "RemoveContainer" containerID="1f8e1a520036ab4af7df334546855c04cdc64b253dbbc502ea0ddf054383f0d3" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.254336 4907 scope.go:117] "RemoveContainer" containerID="5884aa8a804c39334fa05a337ac2f88d91216c92d698904db83e99644bb6a880" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.269609 4907 scope.go:117] 
"RemoveContainer" containerID="618c711bbb2cb49384e0c0b40678b60b2e5d32e4ff47e43bff97dad1479cb56f" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.281032 4907 scope.go:117] "RemoveContainer" containerID="9471866d2d5c1f4231f96f822a0a096b5b92fd21a804ac8b60ee08a2a9740b47" Nov 29 14:35:01 crc kubenswrapper[4907]: I1129 14:35:01.295515 4907 scope.go:117] "RemoveContainer" containerID="268edbeeb474a99a59f4787a16f55b2880316e233d6239150520554e06b79377" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.076824 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-95kn8"] Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077075 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077088 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077102 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b02860-46c7-4498-a162-8e2833deb120" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077109 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b02860-46c7-4498-a162-8e2833deb120" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077119 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerName="extract-utilities" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077127 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerName="extract-utilities" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077136 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" 
containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077144 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077151 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c98c8e5-b9f1-43ca-93f6-cb74695dd076" containerName="marketplace-operator" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077156 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c98c8e5-b9f1-43ca-93f6-cb74695dd076" containerName="marketplace-operator" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077163 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b02860-46c7-4498-a162-8e2833deb120" containerName="extract-utilities" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077169 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b02860-46c7-4498-a162-8e2833deb120" containerName="extract-utilities" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077178 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerName="extract-content" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077185 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerName="extract-content" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077191 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" containerName="extract-content" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077197 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" containerName="extract-content" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077207 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" 
containerName="extract-utilities" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077213 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" containerName="extract-utilities" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077222 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerName="extract-content" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077228 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerName="extract-content" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077238 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077244 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077255 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b02860-46c7-4498-a162-8e2833deb120" containerName="extract-content" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077261 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b02860-46c7-4498-a162-8e2833deb120" containerName="extract-content" Nov 29 14:35:02 crc kubenswrapper[4907]: E1129 14:35:02.077268 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerName="extract-utilities" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077273 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerName="extract-utilities" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077368 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" 
containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077378 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077385 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077398 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c98c8e5-b9f1-43ca-93f6-cb74695dd076" containerName="marketplace-operator" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.077407 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b02860-46c7-4498-a162-8e2833deb120" containerName="registry-server" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.078112 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.080149 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5171ef24-3274-44d5-8d36-8d5be3534c2a-utilities\") pod \"redhat-marketplace-95kn8\" (UID: \"5171ef24-3274-44d5-8d36-8d5be3534c2a\") " pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.080247 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5171ef24-3274-44d5-8d36-8d5be3534c2a-catalog-content\") pod \"redhat-marketplace-95kn8\" (UID: \"5171ef24-3274-44d5-8d36-8d5be3534c2a\") " pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.080298 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjkd\" (UniqueName: \"kubernetes.io/projected/5171ef24-3274-44d5-8d36-8d5be3534c2a-kube-api-access-mdjkd\") pod \"redhat-marketplace-95kn8\" (UID: \"5171ef24-3274-44d5-8d36-8d5be3534c2a\") " pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.086062 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.086975 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-95kn8"] Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.139757 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ltxlt" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.181547 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjkd\" (UniqueName: \"kubernetes.io/projected/5171ef24-3274-44d5-8d36-8d5be3534c2a-kube-api-access-mdjkd\") pod \"redhat-marketplace-95kn8\" (UID: \"5171ef24-3274-44d5-8d36-8d5be3534c2a\") " pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.181690 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5171ef24-3274-44d5-8d36-8d5be3534c2a-utilities\") pod \"redhat-marketplace-95kn8\" (UID: \"5171ef24-3274-44d5-8d36-8d5be3534c2a\") " pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.181759 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5171ef24-3274-44d5-8d36-8d5be3534c2a-catalog-content\") pod \"redhat-marketplace-95kn8\" (UID: 
\"5171ef24-3274-44d5-8d36-8d5be3534c2a\") " pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.182373 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5171ef24-3274-44d5-8d36-8d5be3534c2a-catalog-content\") pod \"redhat-marketplace-95kn8\" (UID: \"5171ef24-3274-44d5-8d36-8d5be3534c2a\") " pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.184173 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5171ef24-3274-44d5-8d36-8d5be3534c2a-utilities\") pod \"redhat-marketplace-95kn8\" (UID: \"5171ef24-3274-44d5-8d36-8d5be3534c2a\") " pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.213399 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjkd\" (UniqueName: \"kubernetes.io/projected/5171ef24-3274-44d5-8d36-8d5be3534c2a-kube-api-access-mdjkd\") pod \"redhat-marketplace-95kn8\" (UID: \"5171ef24-3274-44d5-8d36-8d5be3534c2a\") " pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.278813 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-58pkc"] Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.280649 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.282871 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7315fc63-0710-4bcc-a67a-6c2c649192d0-catalog-content\") pod \"redhat-operators-58pkc\" (UID: \"7315fc63-0710-4bcc-a67a-6c2c649192d0\") " pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.282904 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nc4x\" (UniqueName: \"kubernetes.io/projected/7315fc63-0710-4bcc-a67a-6c2c649192d0-kube-api-access-4nc4x\") pod \"redhat-operators-58pkc\" (UID: \"7315fc63-0710-4bcc-a67a-6c2c649192d0\") " pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.282931 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7315fc63-0710-4bcc-a67a-6c2c649192d0-utilities\") pod \"redhat-operators-58pkc\" (UID: \"7315fc63-0710-4bcc-a67a-6c2c649192d0\") " pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.283364 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.289631 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-58pkc"] Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.385100 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7315fc63-0710-4bcc-a67a-6c2c649192d0-catalog-content\") pod \"redhat-operators-58pkc\" (UID: 
\"7315fc63-0710-4bcc-a67a-6c2c649192d0\") " pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.385231 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nc4x\" (UniqueName: \"kubernetes.io/projected/7315fc63-0710-4bcc-a67a-6c2c649192d0-kube-api-access-4nc4x\") pod \"redhat-operators-58pkc\" (UID: \"7315fc63-0710-4bcc-a67a-6c2c649192d0\") " pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.385293 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7315fc63-0710-4bcc-a67a-6c2c649192d0-utilities\") pod \"redhat-operators-58pkc\" (UID: \"7315fc63-0710-4bcc-a67a-6c2c649192d0\") " pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.385811 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7315fc63-0710-4bcc-a67a-6c2c649192d0-catalog-content\") pod \"redhat-operators-58pkc\" (UID: \"7315fc63-0710-4bcc-a67a-6c2c649192d0\") " pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.386202 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7315fc63-0710-4bcc-a67a-6c2c649192d0-utilities\") pod \"redhat-operators-58pkc\" (UID: \"7315fc63-0710-4bcc-a67a-6c2c649192d0\") " pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.396727 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.403807 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nc4x\" (UniqueName: \"kubernetes.io/projected/7315fc63-0710-4bcc-a67a-6c2c649192d0-kube-api-access-4nc4x\") pod \"redhat-operators-58pkc\" (UID: \"7315fc63-0710-4bcc-a67a-6c2c649192d0\") " pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.496265 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f3989d-c7bb-4c4a-91e1-3b0afaedac98" path="/var/lib/kubelet/pods/10f3989d-c7bb-4c4a-91e1-3b0afaedac98/volumes" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.497311 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c98c8e5-b9f1-43ca-93f6-cb74695dd076" path="/var/lib/kubelet/pods/2c98c8e5-b9f1-43ca-93f6-cb74695dd076/volumes" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.497888 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b02860-46c7-4498-a162-8e2833deb120" path="/var/lib/kubelet/pods/c6b02860-46c7-4498-a162-8e2833deb120/volumes" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.499020 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2b415cf-bdf4-4f9b-8ce8-a75b1a026337" path="/var/lib/kubelet/pods/e2b415cf-bdf4-4f9b-8ce8-a75b1a026337/volumes" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.499657 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74" path="/var/lib/kubelet/pods/f84e5ca4-0f01-4580-a0e1-a8ecfd57ed74/volumes" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.598989 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.789479 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-95kn8"] Nov 29 14:35:02 crc kubenswrapper[4907]: W1129 14:35:02.796971 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5171ef24_3274_44d5_8d36_8d5be3534c2a.slice/crio-73d7a6c644473ac3fbec7476ba2cf055e62c6a70461b9da86509e26d1c952159 WatchSource:0}: Error finding container 73d7a6c644473ac3fbec7476ba2cf055e62c6a70461b9da86509e26d1c952159: Status 404 returned error can't find the container with id 73d7a6c644473ac3fbec7476ba2cf055e62c6a70461b9da86509e26d1c952159 Nov 29 14:35:02 crc kubenswrapper[4907]: I1129 14:35:02.987914 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-58pkc"] Nov 29 14:35:03 crc kubenswrapper[4907]: I1129 14:35:03.145015 4907 generic.go:334] "Generic (PLEG): container finished" podID="5171ef24-3274-44d5-8d36-8d5be3534c2a" containerID="5175df7a246fa517e59c97f8ed2afa3bce1bbcfe2df8535f3f6585833c52b574" exitCode=0 Nov 29 14:35:03 crc kubenswrapper[4907]: I1129 14:35:03.145172 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95kn8" event={"ID":"5171ef24-3274-44d5-8d36-8d5be3534c2a","Type":"ContainerDied","Data":"5175df7a246fa517e59c97f8ed2afa3bce1bbcfe2df8535f3f6585833c52b574"} Nov 29 14:35:03 crc kubenswrapper[4907]: I1129 14:35:03.145247 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95kn8" event={"ID":"5171ef24-3274-44d5-8d36-8d5be3534c2a","Type":"ContainerStarted","Data":"73d7a6c644473ac3fbec7476ba2cf055e62c6a70461b9da86509e26d1c952159"} Nov 29 14:35:03 crc kubenswrapper[4907]: I1129 14:35:03.151392 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-58pkc" event={"ID":"7315fc63-0710-4bcc-a67a-6c2c649192d0","Type":"ContainerStarted","Data":"525b37c0a947dcb19192c1ff675a3ffaa8ff1d4307d049a4979d3e455fdb6f85"} Nov 29 14:35:03 crc kubenswrapper[4907]: I1129 14:35:03.151455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58pkc" event={"ID":"7315fc63-0710-4bcc-a67a-6c2c649192d0","Type":"ContainerStarted","Data":"88eca4c0ca738ae1728e6d6d074eb0ddcbe37b50e70269d0a108c71740c14c1f"} Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.158126 4907 generic.go:334] "Generic (PLEG): container finished" podID="7315fc63-0710-4bcc-a67a-6c2c649192d0" containerID="525b37c0a947dcb19192c1ff675a3ffaa8ff1d4307d049a4979d3e455fdb6f85" exitCode=0 Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.158167 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58pkc" event={"ID":"7315fc63-0710-4bcc-a67a-6c2c649192d0","Type":"ContainerDied","Data":"525b37c0a947dcb19192c1ff675a3ffaa8ff1d4307d049a4979d3e455fdb6f85"} Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.493721 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bzmdm"] Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.495813 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bzmdm"] Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.495953 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.499401 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.616328 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce243ff-d352-42f5-82b7-57f145c149c9-utilities\") pod \"certified-operators-bzmdm\" (UID: \"7ce243ff-d352-42f5-82b7-57f145c149c9\") " pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.616943 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzbfn\" (UniqueName: \"kubernetes.io/projected/7ce243ff-d352-42f5-82b7-57f145c149c9-kube-api-access-xzbfn\") pod \"certified-operators-bzmdm\" (UID: \"7ce243ff-d352-42f5-82b7-57f145c149c9\") " pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.617239 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce243ff-d352-42f5-82b7-57f145c149c9-catalog-content\") pod \"certified-operators-bzmdm\" (UID: \"7ce243ff-d352-42f5-82b7-57f145c149c9\") " pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.678724 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m2p45"] Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.679882 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.682137 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.686902 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2p45"] Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.718465 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce243ff-d352-42f5-82b7-57f145c149c9-utilities\") pod \"certified-operators-bzmdm\" (UID: \"7ce243ff-d352-42f5-82b7-57f145c149c9\") " pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.718521 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzbfn\" (UniqueName: \"kubernetes.io/projected/7ce243ff-d352-42f5-82b7-57f145c149c9-kube-api-access-xzbfn\") pod \"certified-operators-bzmdm\" (UID: \"7ce243ff-d352-42f5-82b7-57f145c149c9\") " pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.718559 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce243ff-d352-42f5-82b7-57f145c149c9-catalog-content\") pod \"certified-operators-bzmdm\" (UID: \"7ce243ff-d352-42f5-82b7-57f145c149c9\") " pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.719011 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce243ff-d352-42f5-82b7-57f145c149c9-utilities\") pod \"certified-operators-bzmdm\" (UID: \"7ce243ff-d352-42f5-82b7-57f145c149c9\") " 
pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.719040 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce243ff-d352-42f5-82b7-57f145c149c9-catalog-content\") pod \"certified-operators-bzmdm\" (UID: \"7ce243ff-d352-42f5-82b7-57f145c149c9\") " pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.750762 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzbfn\" (UniqueName: \"kubernetes.io/projected/7ce243ff-d352-42f5-82b7-57f145c149c9-kube-api-access-xzbfn\") pod \"certified-operators-bzmdm\" (UID: \"7ce243ff-d352-42f5-82b7-57f145c149c9\") " pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.819471 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-catalog-content\") pod \"community-operators-m2p45\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.819520 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9nxt\" (UniqueName: \"kubernetes.io/projected/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-kube-api-access-x9nxt\") pod \"community-operators-m2p45\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.820402 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-utilities\") pod \"community-operators-m2p45\" (UID: 
\"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.821501 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.921024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-utilities\") pod \"community-operators-m2p45\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.921304 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-catalog-content\") pod \"community-operators-m2p45\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.921341 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9nxt\" (UniqueName: \"kubernetes.io/projected/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-kube-api-access-x9nxt\") pod \"community-operators-m2p45\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.921872 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-catalog-content\") pod \"community-operators-m2p45\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.921890 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-utilities\") pod \"community-operators-m2p45\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:04 crc kubenswrapper[4907]: I1129 14:35:04.949801 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9nxt\" (UniqueName: \"kubernetes.io/projected/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-kube-api-access-x9nxt\") pod \"community-operators-m2p45\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:05 crc kubenswrapper[4907]: I1129 14:35:05.018910 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:05 crc kubenswrapper[4907]: I1129 14:35:05.165623 4907 generic.go:334] "Generic (PLEG): container finished" podID="5171ef24-3274-44d5-8d36-8d5be3534c2a" containerID="5414b4b166ab752e38a869a20e15187f81493bff2c702b1cebaa8b7cc4110378" exitCode=0 Nov 29 14:35:05 crc kubenswrapper[4907]: I1129 14:35:05.165685 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95kn8" event={"ID":"5171ef24-3274-44d5-8d36-8d5be3534c2a","Type":"ContainerDied","Data":"5414b4b166ab752e38a869a20e15187f81493bff2c702b1cebaa8b7cc4110378"} Nov 29 14:35:05 crc kubenswrapper[4907]: I1129 14:35:05.168611 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58pkc" event={"ID":"7315fc63-0710-4bcc-a67a-6c2c649192d0","Type":"ContainerStarted","Data":"3f665e2ffa8d4e8ffe3bb7abcf63bc958d3adf54def78b1982f135fb28b906c4"} Nov 29 14:35:05 crc kubenswrapper[4907]: I1129 14:35:05.309610 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bzmdm"] Nov 29 14:35:05 crc kubenswrapper[4907]: W1129 14:35:05.314634 4907 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce243ff_d352_42f5_82b7_57f145c149c9.slice/crio-cf843c55ff9ba9077ee9b709c53f65fdc311969dd58cb24da1fa6811b52f28e5 WatchSource:0}: Error finding container cf843c55ff9ba9077ee9b709c53f65fdc311969dd58cb24da1fa6811b52f28e5: Status 404 returned error can't find the container with id cf843c55ff9ba9077ee9b709c53f65fdc311969dd58cb24da1fa6811b52f28e5 Nov 29 14:35:05 crc kubenswrapper[4907]: W1129 14:35:05.457993 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c4e613_dc17_42e5_ba34_58c07d69b3a0.slice/crio-21a36d66c4fc558284051ba091f42bbd44ce37d4e5fee5343b8ab95aaa61601a WatchSource:0}: Error finding container 21a36d66c4fc558284051ba091f42bbd44ce37d4e5fee5343b8ab95aaa61601a: Status 404 returned error can't find the container with id 21a36d66c4fc558284051ba091f42bbd44ce37d4e5fee5343b8ab95aaa61601a Nov 29 14:35:05 crc kubenswrapper[4907]: I1129 14:35:05.460740 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2p45"] Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 14:35:06.176959 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95kn8" event={"ID":"5171ef24-3274-44d5-8d36-8d5be3534c2a","Type":"ContainerStarted","Data":"db148fcf46c27b170f69e57f1a844406224e9ef5556323b642b9053053f0bb97"} Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 14:35:06.178946 4907 generic.go:334] "Generic (PLEG): container finished" podID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerID="022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef" exitCode=0 Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 14:35:06.179031 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p45" 
event={"ID":"a0c4e613-dc17-42e5-ba34-58c07d69b3a0","Type":"ContainerDied","Data":"022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef"} Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 14:35:06.179099 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p45" event={"ID":"a0c4e613-dc17-42e5-ba34-58c07d69b3a0","Type":"ContainerStarted","Data":"21a36d66c4fc558284051ba091f42bbd44ce37d4e5fee5343b8ab95aaa61601a"} Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 14:35:06.180868 4907 generic.go:334] "Generic (PLEG): container finished" podID="7315fc63-0710-4bcc-a67a-6c2c649192d0" containerID="3f665e2ffa8d4e8ffe3bb7abcf63bc958d3adf54def78b1982f135fb28b906c4" exitCode=0 Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 14:35:06.181023 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58pkc" event={"ID":"7315fc63-0710-4bcc-a67a-6c2c649192d0","Type":"ContainerDied","Data":"3f665e2ffa8d4e8ffe3bb7abcf63bc958d3adf54def78b1982f135fb28b906c4"} Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 14:35:06.183533 4907 generic.go:334] "Generic (PLEG): container finished" podID="7ce243ff-d352-42f5-82b7-57f145c149c9" containerID="0ba1367587d12c1f96a5bbbd216852d0a5324de202fd8830838c44e4899f8ff0" exitCode=0 Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 14:35:06.183552 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzmdm" event={"ID":"7ce243ff-d352-42f5-82b7-57f145c149c9","Type":"ContainerDied","Data":"0ba1367587d12c1f96a5bbbd216852d0a5324de202fd8830838c44e4899f8ff0"} Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 14:35:06.183569 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzmdm" event={"ID":"7ce243ff-d352-42f5-82b7-57f145c149c9","Type":"ContainerStarted","Data":"cf843c55ff9ba9077ee9b709c53f65fdc311969dd58cb24da1fa6811b52f28e5"} Nov 29 14:35:06 crc kubenswrapper[4907]: I1129 
14:35:06.197773 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-95kn8" podStartSLOduration=1.588560548 podStartE2EDuration="4.197756219s" podCreationTimestamp="2025-11-29 14:35:02 +0000 UTC" firstStartedPulling="2025-11-29 14:35:03.14683795 +0000 UTC m=+401.133675602" lastFinishedPulling="2025-11-29 14:35:05.756033621 +0000 UTC m=+403.742871273" observedRunningTime="2025-11-29 14:35:06.196844558 +0000 UTC m=+404.183682210" watchObservedRunningTime="2025-11-29 14:35:06.197756219 +0000 UTC m=+404.184593871" Nov 29 14:35:07 crc kubenswrapper[4907]: I1129 14:35:07.191734 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzmdm" event={"ID":"7ce243ff-d352-42f5-82b7-57f145c149c9","Type":"ContainerStarted","Data":"ed8dbdcaf9222dbdf9e25346ffcf714773d38cca4f81dd86de29530fe0475125"} Nov 29 14:35:07 crc kubenswrapper[4907]: I1129 14:35:07.197172 4907 generic.go:334] "Generic (PLEG): container finished" podID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerID="f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af" exitCode=0 Nov 29 14:35:07 crc kubenswrapper[4907]: I1129 14:35:07.197299 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p45" event={"ID":"a0c4e613-dc17-42e5-ba34-58c07d69b3a0","Type":"ContainerDied","Data":"f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af"} Nov 29 14:35:07 crc kubenswrapper[4907]: I1129 14:35:07.201592 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58pkc" event={"ID":"7315fc63-0710-4bcc-a67a-6c2c649192d0","Type":"ContainerStarted","Data":"523dfd1214084ab7b2ba311f8ab4f7f54bbede2c37cc87d7a3d62025d1037301"} Nov 29 14:35:07 crc kubenswrapper[4907]: I1129 14:35:07.250635 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-58pkc" 
podStartSLOduration=2.743611725 podStartE2EDuration="5.250618108s" podCreationTimestamp="2025-11-29 14:35:02 +0000 UTC" firstStartedPulling="2025-11-29 14:35:04.160014542 +0000 UTC m=+402.146852204" lastFinishedPulling="2025-11-29 14:35:06.667020895 +0000 UTC m=+404.653858587" observedRunningTime="2025-11-29 14:35:07.24525303 +0000 UTC m=+405.232090692" watchObservedRunningTime="2025-11-29 14:35:07.250618108 +0000 UTC m=+405.237455780" Nov 29 14:35:08 crc kubenswrapper[4907]: I1129 14:35:08.211033 4907 generic.go:334] "Generic (PLEG): container finished" podID="7ce243ff-d352-42f5-82b7-57f145c149c9" containerID="ed8dbdcaf9222dbdf9e25346ffcf714773d38cca4f81dd86de29530fe0475125" exitCode=0 Nov 29 14:35:08 crc kubenswrapper[4907]: I1129 14:35:08.211110 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzmdm" event={"ID":"7ce243ff-d352-42f5-82b7-57f145c149c9","Type":"ContainerDied","Data":"ed8dbdcaf9222dbdf9e25346ffcf714773d38cca4f81dd86de29530fe0475125"} Nov 29 14:35:08 crc kubenswrapper[4907]: I1129 14:35:08.215686 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p45" event={"ID":"a0c4e613-dc17-42e5-ba34-58c07d69b3a0","Type":"ContainerStarted","Data":"173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a"} Nov 29 14:35:08 crc kubenswrapper[4907]: I1129 14:35:08.247174 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m2p45" podStartSLOduration=2.793022512 podStartE2EDuration="4.247155264s" podCreationTimestamp="2025-11-29 14:35:04 +0000 UTC" firstStartedPulling="2025-11-29 14:35:06.180767694 +0000 UTC m=+404.167605356" lastFinishedPulling="2025-11-29 14:35:07.634900456 +0000 UTC m=+405.621738108" observedRunningTime="2025-11-29 14:35:08.244964852 +0000 UTC m=+406.231802534" watchObservedRunningTime="2025-11-29 14:35:08.247155264 +0000 UTC m=+406.233992926" Nov 29 14:35:09 crc 
kubenswrapper[4907]: I1129 14:35:09.223047 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzmdm" event={"ID":"7ce243ff-d352-42f5-82b7-57f145c149c9","Type":"ContainerStarted","Data":"1d7405a4d18474bf945a305ac91eb6fe7f1451550cff96e6f36e8bc2b201f49f"} Nov 29 14:35:09 crc kubenswrapper[4907]: I1129 14:35:09.244760 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bzmdm" podStartSLOduration=2.528017179 podStartE2EDuration="5.244739845s" podCreationTimestamp="2025-11-29 14:35:04 +0000 UTC" firstStartedPulling="2025-11-29 14:35:06.185255931 +0000 UTC m=+404.172093583" lastFinishedPulling="2025-11-29 14:35:08.901978607 +0000 UTC m=+406.888816249" observedRunningTime="2025-11-29 14:35:09.240520864 +0000 UTC m=+407.227358516" watchObservedRunningTime="2025-11-29 14:35:09.244739845 +0000 UTC m=+407.231577507" Nov 29 14:35:12 crc kubenswrapper[4907]: I1129 14:35:12.397238 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:12 crc kubenswrapper[4907]: I1129 14:35:12.397888 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:12 crc kubenswrapper[4907]: I1129 14:35:12.467799 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:12 crc kubenswrapper[4907]: I1129 14:35:12.600876 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:12 crc kubenswrapper[4907]: I1129 14:35:12.602743 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:12 crc kubenswrapper[4907]: I1129 14:35:12.665468 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:13 crc kubenswrapper[4907]: I1129 14:35:13.304980 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-95kn8" Nov 29 14:35:13 crc kubenswrapper[4907]: I1129 14:35:13.309292 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-58pkc" Nov 29 14:35:14 crc kubenswrapper[4907]: I1129 14:35:14.823657 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:14 crc kubenswrapper[4907]: I1129 14:35:14.824208 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:14 crc kubenswrapper[4907]: I1129 14:35:14.883086 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:15 crc kubenswrapper[4907]: I1129 14:35:15.019092 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:15 crc kubenswrapper[4907]: I1129 14:35:15.019502 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:15 crc kubenswrapper[4907]: I1129 14:35:15.078867 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:15 crc kubenswrapper[4907]: I1129 14:35:15.299829 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bzmdm" Nov 29 14:35:15 crc kubenswrapper[4907]: I1129 14:35:15.301606 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m2p45" Nov 29 14:35:28 crc 
kubenswrapper[4907]: I1129 14:35:28.490979 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:35:28 crc kubenswrapper[4907]: I1129 14:35:28.491648 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:35:28 crc kubenswrapper[4907]: I1129 14:35:28.491723 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:35:28 crc kubenswrapper[4907]: I1129 14:35:28.492871 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a456b93cdbff1001e9ff31e71e560207b63f5cbe6f442049caf8634aa78242ee"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 14:35:28 crc kubenswrapper[4907]: I1129 14:35:28.492980 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://a456b93cdbff1001e9ff31e71e560207b63f5cbe6f442049caf8634aa78242ee" gracePeriod=600 Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.338014 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" 
containerID="a456b93cdbff1001e9ff31e71e560207b63f5cbe6f442049caf8634aa78242ee" exitCode=0 Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.338122 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"a456b93cdbff1001e9ff31e71e560207b63f5cbe6f442049caf8634aa78242ee"} Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.338541 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"6266188cd3801cb79e9076cc411ccc7b4b18d94f48d528ad87b448fafd9cdc7d"} Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.338597 4907 scope.go:117] "RemoveContainer" containerID="9bd0b4a995c022a1446adc8bf35f9d3a6f507303f24e70de834bd49404482c6f" Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.962165 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j"] Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.963678 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.969481 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.969885 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.970385 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.970484 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.970633 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Nov 29 14:35:29 crc kubenswrapper[4907]: I1129 14:35:29.980328 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j"] Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.046896 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d451fe98-ff6e-40d3-818f-6aa4aaabcf98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-pg25j\" (UID: \"d451fe98-ff6e-40d3-818f-6aa4aaabcf98\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.046984 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d451fe98-ff6e-40d3-818f-6aa4aaabcf98-telemetry-config\") pod 
\"cluster-monitoring-operator-6d5b84845-pg25j\" (UID: \"d451fe98-ff6e-40d3-818f-6aa4aaabcf98\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.047036 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flnxn\" (UniqueName: \"kubernetes.io/projected/d451fe98-ff6e-40d3-818f-6aa4aaabcf98-kube-api-access-flnxn\") pod \"cluster-monitoring-operator-6d5b84845-pg25j\" (UID: \"d451fe98-ff6e-40d3-818f-6aa4aaabcf98\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.148732 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d451fe98-ff6e-40d3-818f-6aa4aaabcf98-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-pg25j\" (UID: \"d451fe98-ff6e-40d3-818f-6aa4aaabcf98\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.148839 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flnxn\" (UniqueName: \"kubernetes.io/projected/d451fe98-ff6e-40d3-818f-6aa4aaabcf98-kube-api-access-flnxn\") pod \"cluster-monitoring-operator-6d5b84845-pg25j\" (UID: \"d451fe98-ff6e-40d3-818f-6aa4aaabcf98\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.148966 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d451fe98-ff6e-40d3-818f-6aa4aaabcf98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-pg25j\" (UID: \"d451fe98-ff6e-40d3-818f-6aa4aaabcf98\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc 
kubenswrapper[4907]: I1129 14:35:30.150575 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d451fe98-ff6e-40d3-818f-6aa4aaabcf98-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-pg25j\" (UID: \"d451fe98-ff6e-40d3-818f-6aa4aaabcf98\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.159946 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d451fe98-ff6e-40d3-818f-6aa4aaabcf98-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-pg25j\" (UID: \"d451fe98-ff6e-40d3-818f-6aa4aaabcf98\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.168262 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flnxn\" (UniqueName: \"kubernetes.io/projected/d451fe98-ff6e-40d3-818f-6aa4aaabcf98-kube-api-access-flnxn\") pod \"cluster-monitoring-operator-6d5b84845-pg25j\" (UID: \"d451fe98-ff6e-40d3-818f-6aa4aaabcf98\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.297698 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" Nov 29 14:35:30 crc kubenswrapper[4907]: I1129 14:35:30.577429 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j"] Nov 29 14:35:31 crc kubenswrapper[4907]: I1129 14:35:31.364048 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" event={"ID":"d451fe98-ff6e-40d3-818f-6aa4aaabcf98","Type":"ContainerStarted","Data":"79db78d77f04a68d68b2d9527822da53ff898caf8c933652956b502fe1f5d833"} Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.103746 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw"] Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.104819 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.106806 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.107099 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-kvcpp" Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.125728 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw"] Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.189052 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e159e4fc-3e0f-4cce-9afb-dc00a3eac5e7-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8fpxw\" (UID: 
\"e159e4fc-3e0f-4cce-9afb-dc00a3eac5e7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.290250 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e159e4fc-3e0f-4cce-9afb-dc00a3eac5e7-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8fpxw\" (UID: \"e159e4fc-3e0f-4cce-9afb-dc00a3eac5e7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.297815 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e159e4fc-3e0f-4cce-9afb-dc00a3eac5e7-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8fpxw\" (UID: \"e159e4fc-3e0f-4cce-9afb-dc00a3eac5e7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.378105 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" event={"ID":"d451fe98-ff6e-40d3-818f-6aa4aaabcf98","Type":"ContainerStarted","Data":"02163964936b456fbe801c8f116b7c5d79164a0bd89f8082a8dfa12ce4867694"} Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.396010 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-pg25j" podStartSLOduration=2.59952058 podStartE2EDuration="4.39598001s" podCreationTimestamp="2025-11-29 14:35:29 +0000 UTC" firstStartedPulling="2025-11-29 14:35:30.616248481 +0000 UTC m=+428.603086143" lastFinishedPulling="2025-11-29 14:35:32.412707921 +0000 UTC m=+430.399545573" observedRunningTime="2025-11-29 14:35:33.394568897 +0000 UTC m=+431.381406579" watchObservedRunningTime="2025-11-29 14:35:33.39598001 +0000 UTC 
m=+431.382817702" Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.418358 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" Nov 29 14:35:33 crc kubenswrapper[4907]: I1129 14:35:33.699095 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw"] Nov 29 14:35:34 crc kubenswrapper[4907]: I1129 14:35:34.388282 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" event={"ID":"e159e4fc-3e0f-4cce-9afb-dc00a3eac5e7","Type":"ContainerStarted","Data":"8efc0c0b0fb0076a5c45f7ec83399e348c244380aa8ca3ec588b586411638e3a"} Nov 29 14:35:36 crc kubenswrapper[4907]: I1129 14:35:36.405145 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" event={"ID":"e159e4fc-3e0f-4cce-9afb-dc00a3eac5e7","Type":"ContainerStarted","Data":"52f47a621eecfaef8e1111600c671b3a50ff6f37288e464b4db7265d3c97ca1e"} Nov 29 14:35:36 crc kubenswrapper[4907]: I1129 14:35:36.405690 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" Nov 29 14:35:36 crc kubenswrapper[4907]: I1129 14:35:36.413914 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" Nov 29 14:35:36 crc kubenswrapper[4907]: I1129 14:35:36.432678 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8fpxw" podStartSLOduration=1.839063299 podStartE2EDuration="3.43265421s" podCreationTimestamp="2025-11-29 14:35:33 +0000 UTC" firstStartedPulling="2025-11-29 14:35:33.710037563 +0000 UTC m=+431.696875205" 
lastFinishedPulling="2025-11-29 14:35:35.303628464 +0000 UTC m=+433.290466116" observedRunningTime="2025-11-29 14:35:36.427054086 +0000 UTC m=+434.413891768" watchObservedRunningTime="2025-11-29 14:35:36.43265421 +0000 UTC m=+434.419491892" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.181258 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-45ztv"] Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.182937 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.185593 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.188388 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.188763 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.191755 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-f7nnr" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.198567 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-45ztv"] Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.254116 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87cpx\" (UniqueName: \"kubernetes.io/projected/4d9c84b5-793c-46f3-a588-db5ac736fe12-kube-api-access-87cpx\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 
29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.254389 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d9c84b5-793c-46f3-a588-db5ac736fe12-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.254514 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d9c84b5-793c-46f3-a588-db5ac736fe12-metrics-client-ca\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.254612 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d9c84b5-793c-46f3-a588-db5ac736fe12-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.355755 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d9c84b5-793c-46f3-a588-db5ac736fe12-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.355873 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-87cpx\" (UniqueName: \"kubernetes.io/projected/4d9c84b5-793c-46f3-a588-db5ac736fe12-kube-api-access-87cpx\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.355991 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d9c84b5-793c-46f3-a588-db5ac736fe12-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.356033 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d9c84b5-793c-46f3-a588-db5ac736fe12-metrics-client-ca\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: E1129 14:35:37.356551 4907 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Nov 29 14:35:37 crc kubenswrapper[4907]: E1129 14:35:37.356684 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d9c84b5-793c-46f3-a588-db5ac736fe12-prometheus-operator-tls podName:4d9c84b5-793c-46f3-a588-db5ac736fe12 nodeName:}" failed. No retries permitted until 2025-11-29 14:35:37.856654295 +0000 UTC m=+435.843491977 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/4d9c84b5-793c-46f3-a588-db5ac736fe12-prometheus-operator-tls") pod "prometheus-operator-db54df47d-45ztv" (UID: "4d9c84b5-793c-46f3-a588-db5ac736fe12") : secret "prometheus-operator-tls" not found Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.358106 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d9c84b5-793c-46f3-a588-db5ac736fe12-metrics-client-ca\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.377665 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87cpx\" (UniqueName: \"kubernetes.io/projected/4d9c84b5-793c-46f3-a588-db5ac736fe12-kube-api-access-87cpx\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.389808 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d9c84b5-793c-46f3-a588-db5ac736fe12-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.866105 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d9c84b5-793c-46f3-a588-db5ac736fe12-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " 
pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:37 crc kubenswrapper[4907]: I1129 14:35:37.871281 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d9c84b5-793c-46f3-a588-db5ac736fe12-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-45ztv\" (UID: \"4d9c84b5-793c-46f3-a588-db5ac736fe12\") " pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:38 crc kubenswrapper[4907]: I1129 14:35:38.145482 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" Nov 29 14:35:38 crc kubenswrapper[4907]: I1129 14:35:38.693066 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-45ztv"] Nov 29 14:35:38 crc kubenswrapper[4907]: W1129 14:35:38.704782 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d9c84b5_793c_46f3_a588_db5ac736fe12.slice/crio-cc16aefe544820e2f0cf5970fe3d39e4143920fae068b4edc24d9020216a61e1 WatchSource:0}: Error finding container cc16aefe544820e2f0cf5970fe3d39e4143920fae068b4edc24d9020216a61e1: Status 404 returned error can't find the container with id cc16aefe544820e2f0cf5970fe3d39e4143920fae068b4edc24d9020216a61e1 Nov 29 14:35:39 crc kubenswrapper[4907]: I1129 14:35:39.429655 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" event={"ID":"4d9c84b5-793c-46f3-a588-db5ac736fe12","Type":"ContainerStarted","Data":"cc16aefe544820e2f0cf5970fe3d39e4143920fae068b4edc24d9020216a61e1"} Nov 29 14:35:41 crc kubenswrapper[4907]: I1129 14:35:41.447509 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" 
event={"ID":"4d9c84b5-793c-46f3-a588-db5ac736fe12","Type":"ContainerStarted","Data":"df0da25e2d8b3f50698031895fd957d695a5d1df5a32f69bf5bfe0c1413e5651"} Nov 29 14:35:41 crc kubenswrapper[4907]: I1129 14:35:41.447899 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" event={"ID":"4d9c84b5-793c-46f3-a588-db5ac736fe12","Type":"ContainerStarted","Data":"c166612f983ca2496ac4a49f375d3a91abf06c0352e83e252335d815095d6e34"} Nov 29 14:35:41 crc kubenswrapper[4907]: I1129 14:35:41.481612 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-45ztv" podStartSLOduration=2.201310056 podStartE2EDuration="4.481587119s" podCreationTimestamp="2025-11-29 14:35:37 +0000 UTC" firstStartedPulling="2025-11-29 14:35:38.709564094 +0000 UTC m=+436.696401756" lastFinishedPulling="2025-11-29 14:35:40.989841137 +0000 UTC m=+438.976678819" observedRunningTime="2025-11-29 14:35:41.478187788 +0000 UTC m=+439.465025480" watchObservedRunningTime="2025-11-29 14:35:41.481587119 +0000 UTC m=+439.468424801" Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.562852 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"] Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.564423 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk" Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.566706 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.567183 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-c7k4m" Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.567369 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.587264 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"] Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.605214 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"] Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.606703 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.608249 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.608428 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.608564 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.610799 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-42p95"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.621294 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"]
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.650349 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zhchg"]
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.651896 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.654203 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.656913 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-b8s7p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.658132 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677248 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh85j\" (UniqueName: \"kubernetes.io/projected/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-api-access-wh85j\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677300 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec0e8965-93a7-46fa-9b06-a886fc03ad92-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677337 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09cf8bd5-3cce-4761-89a4-a155e3ef0032-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677369 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ec0e8965-93a7-46fa-9b06-a886fc03ad92-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677398 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677576 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09cf8bd5-3cce-4761-89a4-a155e3ef0032-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677614 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p97l8\" (UniqueName: \"kubernetes.io/projected/09cf8bd5-3cce-4761-89a4-a155e3ef0032-kube-api-access-p97l8\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677653 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09cf8bd5-3cce-4761-89a4-a155e3ef0032-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677679 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.677739 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-tls\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779219 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfrwv\" (UniqueName: \"kubernetes.io/projected/d1a651c1-e510-457d-86a0-b64a76d8fe5c-kube-api-access-bfrwv\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779270 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779305 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-wtmp\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779335 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh85j\" (UniqueName: \"kubernetes.io/projected/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-api-access-wh85j\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779377 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec0e8965-93a7-46fa-9b06-a886fc03ad92-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779409 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09cf8bd5-3cce-4761-89a4-a155e3ef0032-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779433 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ec0e8965-93a7-46fa-9b06-a886fc03ad92-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779473 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779508 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09cf8bd5-3cce-4761-89a4-a155e3ef0032-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779537 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p97l8\" (UniqueName: \"kubernetes.io/projected/09cf8bd5-3cce-4761-89a4-a155e3ef0032-kube-api-access-p97l8\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779569 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1a651c1-e510-457d-86a0-b64a76d8fe5c-sys\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779595 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779619 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a651c1-e510-457d-86a0-b64a76d8fe5c-metrics-client-ca\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779640 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d1a651c1-e510-457d-86a0-b64a76d8fe5c-root\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779661 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09cf8bd5-3cce-4761-89a4-a155e3ef0032-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779682 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-textfile\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.779705 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.780111 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ec0e8965-93a7-46fa-9b06-a886fc03ad92-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.780926 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09cf8bd5-3cce-4761-89a4-a155e3ef0032-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.780988 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec0e8965-93a7-46fa-9b06-a886fc03ad92-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.781496 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.792008 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09cf8bd5-3cce-4761-89a4-a155e3ef0032-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.798035 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09cf8bd5-3cce-4761-89a4-a155e3ef0032-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.803143 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p97l8\" (UniqueName: \"kubernetes.io/projected/09cf8bd5-3cce-4761-89a4-a155e3ef0032-kube-api-access-p97l8\") pod \"openshift-state-metrics-566fddb674-l59tk\" (UID: \"09cf8bd5-3cce-4761-89a4-a155e3ef0032\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.803249 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.806407 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.824934 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh85j\" (UniqueName: \"kubernetes.io/projected/ec0e8965-93a7-46fa-9b06-a886fc03ad92-kube-api-access-wh85j\") pod \"kube-state-metrics-777cb5bd5d-8sf5p\" (UID: \"ec0e8965-93a7-46fa-9b06-a886fc03ad92\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-textfile\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881171 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-tls\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881225 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfrwv\" (UniqueName: \"kubernetes.io/projected/d1a651c1-e510-457d-86a0-b64a76d8fe5c-kube-api-access-bfrwv\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881277 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-wtmp\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881349 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1a651c1-e510-457d-86a0-b64a76d8fe5c-sys\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881383 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881413 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a651c1-e510-457d-86a0-b64a76d8fe5c-metrics-client-ca\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881464 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d1a651c1-e510-457d-86a0-b64a76d8fe5c-root\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881571 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d1a651c1-e510-457d-86a0-b64a76d8fe5c-root\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881633 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1a651c1-e510-457d-86a0-b64a76d8fe5c-sys\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881668 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-textfile\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.881692 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-wtmp\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.882106 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.882569 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a651c1-e510-457d-86a0-b64a76d8fe5c-metrics-client-ca\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.887958 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.887997 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d1a651c1-e510-457d-86a0-b64a76d8fe5c-node-exporter-tls\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.903007 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfrwv\" (UniqueName: \"kubernetes.io/projected/d1a651c1-e510-457d-86a0-b64a76d8fe5c-kube-api-access-bfrwv\") pod \"node-exporter-zhchg\" (UID: \"d1a651c1-e510-457d-86a0-b64a76d8fe5c\") " pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.924493 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"
Nov 29 14:35:43 crc kubenswrapper[4907]: I1129 14:35:43.970084 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zhchg"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.350481 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-l59tk"]
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.436600 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p"]
Nov 29 14:35:44 crc kubenswrapper[4907]: W1129 14:35:44.443406 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0e8965_93a7_46fa_9b06_a886fc03ad92.slice/crio-ee8258e682683fe6add56cbbcb4dec7e6585b945eeb154e79ea83b609a4d3212 WatchSource:0}: Error finding container ee8258e682683fe6add56cbbcb4dec7e6585b945eeb154e79ea83b609a4d3212: Status 404 returned error can't find the container with id ee8258e682683fe6add56cbbcb4dec7e6585b945eeb154e79ea83b609a4d3212
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.477770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p" event={"ID":"ec0e8965-93a7-46fa-9b06-a886fc03ad92","Type":"ContainerStarted","Data":"ee8258e682683fe6add56cbbcb4dec7e6585b945eeb154e79ea83b609a4d3212"}
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.487367 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk" event={"ID":"09cf8bd5-3cce-4761-89a4-a155e3ef0032","Type":"ContainerStarted","Data":"e5846e6498e5e2021958ad5c26cd45021f9ab7718d0d084cce553e3caeda4f46"}
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.487420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zhchg" event={"ID":"d1a651c1-e510-457d-86a0-b64a76d8fe5c","Type":"ContainerStarted","Data":"138219e513ad182aa22c603c99b36299c3e678c3328db88eb051d2cbacc9d1ea"}
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.672623 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.675618 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.678304 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.679865 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.679980 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.680092 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.683812 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.684682 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.686365 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-v9th4"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.696195 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.713838 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.728255 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819259 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cebfb025-6db9-479b-86e0-1f0044011f24-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819311 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-web-config\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819336 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cebfb025-6db9-479b-86e0-1f0044011f24-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819365 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-config-volume\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819383 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlx48\" (UniqueName: \"kubernetes.io/projected/cebfb025-6db9-479b-86e0-1f0044011f24-kube-api-access-vlx48\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819470 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819489 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819509 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cebfb025-6db9-479b-86e0-1f0044011f24-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819532 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cebfb025-6db9-479b-86e0-1f0044011f24-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.819566 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cebfb025-6db9-479b-86e0-1f0044011f24-config-out\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921311 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921382 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cebfb025-6db9-479b-86e0-1f0044011f24-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921419 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cebfb025-6db9-479b-86e0-1f0044011f24-config-out\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921476 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cebfb025-6db9-479b-86e0-1f0044011f24-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921502 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cebfb025-6db9-479b-86e0-1f0044011f24-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921527 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-web-config\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921567 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-config-volume\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921592 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlx48\" (UniqueName: \"kubernetes.io/projected/cebfb025-6db9-479b-86e0-1f0044011f24-kube-api-access-vlx48\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921639 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921682 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921713 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.921742 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cebfb025-6db9-479b-86e0-1f0044011f24-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.925086 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cebfb025-6db9-479b-86e0-1f0044011f24-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.925906 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cebfb025-6db9-479b-86e0-1f0044011f24-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.926824 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cebfb025-6db9-479b-86e0-1f0044011f24-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.930079 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cebfb025-6db9-479b-86e0-1f0044011f24-config-out\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0"
Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.930522 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-config-volume\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0" Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.930544 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0" Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.930874 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cebfb025-6db9-479b-86e0-1f0044011f24-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0" Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.932248 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0" Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.932292 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-web-config\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0" Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.934720 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0" Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.940212 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlx48\" (UniqueName: \"kubernetes.io/projected/cebfb025-6db9-479b-86e0-1f0044011f24-kube-api-access-vlx48\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0" Nov 29 14:35:44 crc kubenswrapper[4907]: I1129 14:35:44.948852 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cebfb025-6db9-479b-86e0-1f0044011f24-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cebfb025-6db9-479b-86e0-1f0044011f24\") " pod="openshift-monitoring/alertmanager-main-0" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.030183 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.487296 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk" event={"ID":"09cf8bd5-3cce-4761-89a4-a155e3ef0032","Type":"ContainerStarted","Data":"bea6999e6a3270d3f7fab42e707dfc3b932a99eb4cacc44a6fcb6d162cd496e5"} Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.487835 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk" event={"ID":"09cf8bd5-3cce-4761-89a4-a155e3ef0032","Type":"ContainerStarted","Data":"46e22590f3eb63b49f801041de3a85b75f11ba018695dfb566a5da783b39ce84"} Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.492024 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zhchg" event={"ID":"d1a651c1-e510-457d-86a0-b64a76d8fe5c","Type":"ContainerStarted","Data":"4cfbf44155f7df2e59302ba052cd4e9a9e2f2b8fb7077eff63e80a62b1dd192f"} Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.658911 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Nov 29 14:35:45 crc kubenswrapper[4907]: W1129 14:35:45.660790 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcebfb025_6db9_479b_86e0_1f0044011f24.slice/crio-e943690e54e6280389280273b420a531d11c257f1691c162bd7dafb4049e894e WatchSource:0}: Error finding container e943690e54e6280389280273b420a531d11c257f1691c162bd7dafb4049e894e: Status 404 returned error can't find the container with id e943690e54e6280389280273b420a531d11c257f1691c162bd7dafb4049e894e Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.702415 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6fc999d58d-xndt8"] Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 
14:35:45.704059 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.710137 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.710197 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-8813cck8ndhn8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.710230 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.710308 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.710556 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-dls22" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.710725 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.712728 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.723875 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6fc999d58d-xndt8"] Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.833218 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.833280 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.833342 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-tls\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.833450 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-grpc-tls\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.833562 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc 
kubenswrapper[4907]: I1129 14:35:45.833614 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5478c134-753d-455a-9b4a-cfa5aeb41239-metrics-client-ca\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.833665 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt5gp\" (UniqueName: \"kubernetes.io/projected/5478c134-753d-455a-9b4a-cfa5aeb41239-kube-api-access-xt5gp\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.833779 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.935547 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt5gp\" (UniqueName: \"kubernetes.io/projected/5478c134-753d-455a-9b4a-cfa5aeb41239-kube-api-access-xt5gp\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.935613 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.935651 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.935677 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.935697 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-tls\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.935721 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-grpc-tls\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" 
Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.935748 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.935767 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5478c134-753d-455a-9b4a-cfa5aeb41239-metrics-client-ca\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.936675 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5478c134-753d-455a-9b4a-cfa5aeb41239-metrics-client-ca\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.942553 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.943295 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-grpc-tls\") pod 
\"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.943307 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.944360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.953589 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt5gp\" (UniqueName: \"kubernetes.io/projected/5478c134-753d-455a-9b4a-cfa5aeb41239-kube-api-access-xt5gp\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.955237 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:45 crc kubenswrapper[4907]: I1129 14:35:45.956949 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5478c134-753d-455a-9b4a-cfa5aeb41239-secret-thanos-querier-tls\") pod \"thanos-querier-6fc999d58d-xndt8\" (UID: \"5478c134-753d-455a-9b4a-cfa5aeb41239\") " pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:46 crc kubenswrapper[4907]: I1129 14:35:46.076243 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:46 crc kubenswrapper[4907]: I1129 14:35:46.498895 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cebfb025-6db9-479b-86e0-1f0044011f24","Type":"ContainerStarted","Data":"e943690e54e6280389280273b420a531d11c257f1691c162bd7dafb4049e894e"} Nov 29 14:35:46 crc kubenswrapper[4907]: I1129 14:35:46.500740 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1a651c1-e510-457d-86a0-b64a76d8fe5c" containerID="4cfbf44155f7df2e59302ba052cd4e9a9e2f2b8fb7077eff63e80a62b1dd192f" exitCode=0 Nov 29 14:35:46 crc kubenswrapper[4907]: I1129 14:35:46.500783 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zhchg" event={"ID":"d1a651c1-e510-457d-86a0-b64a76d8fe5c","Type":"ContainerDied","Data":"4cfbf44155f7df2e59302ba052cd4e9a9e2f2b8fb7077eff63e80a62b1dd192f"} Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.327199 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6fc999d58d-xndt8"] Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.509564 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk" event={"ID":"09cf8bd5-3cce-4761-89a4-a155e3ef0032","Type":"ContainerStarted","Data":"f7412c737c96f3dde8aaa07bfa8dc5ea70e985ef1d7ea7cb14947a68aecb25d4"} Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.512524 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zhchg" event={"ID":"d1a651c1-e510-457d-86a0-b64a76d8fe5c","Type":"ContainerStarted","Data":"09982026d2e740fd6dd85769b46838544989958fda54b87395900acc4232a453"} Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.512553 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zhchg" event={"ID":"d1a651c1-e510-457d-86a0-b64a76d8fe5c","Type":"ContainerStarted","Data":"28cdfa32ec944a5fa5dae89af3027cb5c2858113b00a53f11bea49fa20cdc843"} Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.514637 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" event={"ID":"5478c134-753d-455a-9b4a-cfa5aeb41239","Type":"ContainerStarted","Data":"85231f66c3f39607b5eea266be4ed91b3201200c010517e79a37c65964941b10"} Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.517173 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p" event={"ID":"ec0e8965-93a7-46fa-9b06-a886fc03ad92","Type":"ContainerStarted","Data":"f95ef77ca362e3b32da2b58e0d95288b990320e59c7db5848dde7916fcfab09c"} Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.517204 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p" event={"ID":"ec0e8965-93a7-46fa-9b06-a886fc03ad92","Type":"ContainerStarted","Data":"c7733dbe8306a4da674b437a8d8a8251343cac96d1d4274345f58e9c556ac816"} Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.517221 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p" event={"ID":"ec0e8965-93a7-46fa-9b06-a886fc03ad92","Type":"ContainerStarted","Data":"c8d16dc37fb238efdfe7c34d10b36f542746b4135d9990a6defc4d8e8b161e77"} Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.529576 4907 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-l59tk" podStartSLOduration=2.288724182 podStartE2EDuration="4.529556343s" podCreationTimestamp="2025-11-29 14:35:43 +0000 UTC" firstStartedPulling="2025-11-29 14:35:44.695007156 +0000 UTC m=+442.681844808" lastFinishedPulling="2025-11-29 14:35:46.935839297 +0000 UTC m=+444.922676969" observedRunningTime="2025-11-29 14:35:47.525637959 +0000 UTC m=+445.512475611" watchObservedRunningTime="2025-11-29 14:35:47.529556343 +0000 UTC m=+445.516394005" Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.547176 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-8sf5p" podStartSLOduration=2.07115415 podStartE2EDuration="4.547152802s" podCreationTimestamp="2025-11-29 14:35:43 +0000 UTC" firstStartedPulling="2025-11-29 14:35:44.446846525 +0000 UTC m=+442.433684197" lastFinishedPulling="2025-11-29 14:35:46.922845197 +0000 UTC m=+444.909682849" observedRunningTime="2025-11-29 14:35:47.546657111 +0000 UTC m=+445.533494763" watchObservedRunningTime="2025-11-29 14:35:47.547152802 +0000 UTC m=+445.533990454" Nov 29 14:35:47 crc kubenswrapper[4907]: I1129 14:35:47.571529 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zhchg" podStartSLOduration=3.328417255 podStartE2EDuration="4.571494193s" podCreationTimestamp="2025-11-29 14:35:43 +0000 UTC" firstStartedPulling="2025-11-29 14:35:44.000032714 +0000 UTC m=+441.986870376" lastFinishedPulling="2025-11-29 14:35:45.243109662 +0000 UTC m=+443.229947314" observedRunningTime="2025-11-29 14:35:47.571219667 +0000 UTC m=+445.558057339" watchObservedRunningTime="2025-11-29 14:35:47.571494193 +0000 UTC m=+445.558331875" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.428693 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6857d79cbf-qntjc"] Nov 29 14:35:48 crc 
kubenswrapper[4907]: I1129 14:35:48.430942 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.451115 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6857d79cbf-qntjc"] Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.523007 4907 generic.go:334] "Generic (PLEG): container finished" podID="cebfb025-6db9-479b-86e0-1f0044011f24" containerID="97444b6cf14cd1695990d1ceef40b64d79d42ad9887524fdee63cc247c09a0e8" exitCode=0 Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.523305 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cebfb025-6db9-479b-86e0-1f0044011f24","Type":"ContainerDied","Data":"97444b6cf14cd1695990d1ceef40b64d79d42ad9887524fdee63cc247c09a0e8"} Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.578614 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-oauth-serving-cert\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.578666 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-oauth-config\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.578695 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czncc\" (UniqueName: 
\"kubernetes.io/projected/6e8125a1-06a4-413d-a191-76857064a187-kube-api-access-czncc\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.578730 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-serving-cert\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.578750 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-service-ca\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.578774 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-console-config\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.578799 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-trusted-ca-bundle\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.680122 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-service-ca\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.680852 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-console-config\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.681122 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-trusted-ca-bundle\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.681579 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-oauth-serving-cert\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.681704 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-oauth-config\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.681811 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czncc\" 
(UniqueName: \"kubernetes.io/projected/6e8125a1-06a4-413d-a191-76857064a187-kube-api-access-czncc\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.681858 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-service-ca\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.682001 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-console-config\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.682043 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-serving-cert\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.682542 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-trusted-ca-bundle\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.683276 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-oauth-serving-cert\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.687692 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-oauth-config\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.699920 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czncc\" (UniqueName: \"kubernetes.io/projected/6e8125a1-06a4-413d-a191-76857064a187-kube-api-access-czncc\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.703269 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-serving-cert\") pod \"console-6857d79cbf-qntjc\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.750152 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.965602 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-57749cbfbd-jqfsm"] Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.966378 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.968851 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.969296 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-a24jmbbkk7bm0" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.969553 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.969675 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-nq87v" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.970428 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.970576 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.982896 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6857d79cbf-qntjc"] Nov 29 14:35:48 crc kubenswrapper[4907]: I1129 14:35:48.988234 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57749cbfbd-jqfsm"] Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.089591 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7760e7b9-ceed-473a-810f-c4f5dc02bb41-audit-log\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.089898 
4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7760e7b9-ceed-473a-810f-c4f5dc02bb41-metrics-server-audit-profiles\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.089920 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rfvt\" (UniqueName: \"kubernetes.io/projected/7760e7b9-ceed-473a-810f-c4f5dc02bb41-kube-api-access-8rfvt\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.089953 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7760e7b9-ceed-473a-810f-c4f5dc02bb41-client-ca-bundle\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.090179 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7760e7b9-ceed-473a-810f-c4f5dc02bb41-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.090268 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/7760e7b9-ceed-473a-810f-c4f5dc02bb41-secret-metrics-server-tls\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.090314 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7760e7b9-ceed-473a-810f-c4f5dc02bb41-secret-metrics-client-certs\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.192720 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7760e7b9-ceed-473a-810f-c4f5dc02bb41-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.192801 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7760e7b9-ceed-473a-810f-c4f5dc02bb41-secret-metrics-server-tls\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.192840 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7760e7b9-ceed-473a-810f-c4f5dc02bb41-secret-metrics-client-certs\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 
14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.192896 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7760e7b9-ceed-473a-810f-c4f5dc02bb41-audit-log\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.192933 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7760e7b9-ceed-473a-810f-c4f5dc02bb41-metrics-server-audit-profiles\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.192954 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rfvt\" (UniqueName: \"kubernetes.io/projected/7760e7b9-ceed-473a-810f-c4f5dc02bb41-kube-api-access-8rfvt\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.192987 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7760e7b9-ceed-473a-810f-c4f5dc02bb41-client-ca-bundle\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.193859 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7760e7b9-ceed-473a-810f-c4f5dc02bb41-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: 
\"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.194352 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7760e7b9-ceed-473a-810f-c4f5dc02bb41-audit-log\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.195348 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7760e7b9-ceed-473a-810f-c4f5dc02bb41-metrics-server-audit-profiles\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.199064 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7760e7b9-ceed-473a-810f-c4f5dc02bb41-secret-metrics-client-certs\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.200873 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7760e7b9-ceed-473a-810f-c4f5dc02bb41-secret-metrics-server-tls\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.215955 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rfvt\" (UniqueName: 
\"kubernetes.io/projected/7760e7b9-ceed-473a-810f-c4f5dc02bb41-kube-api-access-8rfvt\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.219083 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7760e7b9-ceed-473a-810f-c4f5dc02bb41-client-ca-bundle\") pod \"metrics-server-57749cbfbd-jqfsm\" (UID: \"7760e7b9-ceed-473a-810f-c4f5dc02bb41\") " pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.290962 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.369855 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk"] Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.371348 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.373919 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.374422 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.375325 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk"] Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.498218 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f5506da-1d9d-4e8a-ac74-03cc70419787-monitoring-plugin-cert\") pod \"monitoring-plugin-6df995fb7f-d45dk\" (UID: \"2f5506da-1d9d-4e8a-ac74-03cc70419787\") " pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.531637 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6857d79cbf-qntjc" event={"ID":"6e8125a1-06a4-413d-a191-76857064a187","Type":"ContainerStarted","Data":"a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec"} Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.531694 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6857d79cbf-qntjc" event={"ID":"6e8125a1-06a4-413d-a191-76857064a187","Type":"ContainerStarted","Data":"07402786aba9b62f0c0a21d7cf7b9624368599961bcfee5c5a3b97eecbf60898"} Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.555678 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6857d79cbf-qntjc" podStartSLOduration=1.555658612 podStartE2EDuration="1.555658612s" podCreationTimestamp="2025-11-29 
14:35:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:35:49.551861792 +0000 UTC m=+447.538699454" watchObservedRunningTime="2025-11-29 14:35:49.555658612 +0000 UTC m=+447.542496264" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.600214 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f5506da-1d9d-4e8a-ac74-03cc70419787-monitoring-plugin-cert\") pod \"monitoring-plugin-6df995fb7f-d45dk\" (UID: \"2f5506da-1d9d-4e8a-ac74-03cc70419787\") " pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.605012 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f5506da-1d9d-4e8a-ac74-03cc70419787-monitoring-plugin-cert\") pod \"monitoring-plugin-6df995fb7f-d45dk\" (UID: \"2f5506da-1d9d-4e8a-ac74-03cc70419787\") " pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.692667 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.929840 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.932689 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.935924 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.936101 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.936597 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.936716 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.938080 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-1hejr66tl3iur" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.938651 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.939071 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.939561 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.949053 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.949178 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.949359 4907 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-8z4pt" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.952265 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.953910 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Nov 29 14:35:49 crc kubenswrapper[4907]: I1129 14:35:49.964204 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106239 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106302 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-web-config\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106340 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106369 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106394 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106428 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106511 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdgpq\" (UniqueName: \"kubernetes.io/projected/fedc5702-915e-487c-a140-2e1a86299542-kube-api-access-rdgpq\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106534 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fedc5702-915e-487c-a140-2e1a86299542-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc 
kubenswrapper[4907]: I1129 14:35:50.106742 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106811 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106840 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-config\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.106872 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fedc5702-915e-487c-a140-2e1a86299542-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.107021 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.107054 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.107177 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.107243 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fedc5702-915e-487c-a140-2e1a86299542-config-out\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.107321 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.107353 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.208924 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-config\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.208989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fedc5702-915e-487c-a140-2e1a86299542-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209027 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209044 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209073 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209096 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fedc5702-915e-487c-a140-2e1a86299542-config-out\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209121 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209143 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209174 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209209 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-web-config\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209236 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209262 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209283 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209309 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdgpq\" 
(UniqueName: \"kubernetes.io/projected/fedc5702-915e-487c-a140-2e1a86299542-kube-api-access-rdgpq\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209359 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fedc5702-915e-487c-a140-2e1a86299542-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209383 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.209403 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.211107 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fedc5702-915e-487c-a140-2e1a86299542-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.211677 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.211914 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.217731 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.218822 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.222072 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.222508 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/fedc5702-915e-487c-a140-2e1a86299542-config-out\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.222525 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-web-config\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.222616 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.226488 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.226557 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fedc5702-915e-487c-a140-2e1a86299542-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.227330 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/fedc5702-915e-487c-a140-2e1a86299542-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.227367 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.227743 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.228525 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.228803 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.231759 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fedc5702-915e-487c-a140-2e1a86299542-config\") pod \"prometheus-k8s-0\" (UID: 
\"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.232340 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdgpq\" (UniqueName: \"kubernetes.io/projected/fedc5702-915e-487c-a140-2e1a86299542-kube-api-access-rdgpq\") pod \"prometheus-k8s-0\" (UID: \"fedc5702-915e-487c-a140-2e1a86299542\") " pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.259901 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.474791 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57749cbfbd-jqfsm"] Nov 29 14:35:50 crc kubenswrapper[4907]: I1129 14:35:50.491152 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk"] Nov 29 14:35:50 crc kubenswrapper[4907]: W1129 14:35:50.822869 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f5506da_1d9d_4e8a_ac74_03cc70419787.slice/crio-6a1923e7b9dcbcebc74374e81de451a5491497efcfb3198021ed7f067390e205 WatchSource:0}: Error finding container 6a1923e7b9dcbcebc74374e81de451a5491497efcfb3198021ed7f067390e205: Status 404 returned error can't find the container with id 6a1923e7b9dcbcebc74374e81de451a5491497efcfb3198021ed7f067390e205 Nov 29 14:35:50 crc kubenswrapper[4907]: W1129 14:35:50.824941 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7760e7b9_ceed_473a_810f_c4f5dc02bb41.slice/crio-698db61bee6e7742057c4c25c32e4dcb5720930bb4c6658f8418299345cf4b7c WatchSource:0}: Error finding container 698db61bee6e7742057c4c25c32e4dcb5720930bb4c6658f8418299345cf4b7c: Status 404 returned error can't 
find the container with id 698db61bee6e7742057c4c25c32e4dcb5720930bb4c6658f8418299345cf4b7c Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.382880 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.560793 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" event={"ID":"2f5506da-1d9d-4e8a-ac74-03cc70419787","Type":"ContainerStarted","Data":"6a1923e7b9dcbcebc74374e81de451a5491497efcfb3198021ed7f067390e205"} Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.565029 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" event={"ID":"7760e7b9-ceed-473a-810f-c4f5dc02bb41","Type":"ContainerStarted","Data":"698db61bee6e7742057c4c25c32e4dcb5720930bb4c6658f8418299345cf4b7c"} Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.568806 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cebfb025-6db9-479b-86e0-1f0044011f24","Type":"ContainerStarted","Data":"e375e32e8bf79967771f96948e4bf85e638cb5054feb4aa71b42119e92ea234e"} Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.568843 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cebfb025-6db9-479b-86e0-1f0044011f24","Type":"ContainerStarted","Data":"3dbebe8947e107b6ade7b48bf49eedea0bfc491288e537d4caff432729739d4c"} Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.568858 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cebfb025-6db9-479b-86e0-1f0044011f24","Type":"ContainerStarted","Data":"db875573691ce834e01320c6c59dcb41d17949689c25d04340417fe4d056548c"} Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.571577 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fedc5702-915e-487c-a140-2e1a86299542","Type":"ContainerStarted","Data":"de50446a1af77671ef7008a0649c6c5adacbc7691b997f36905f03f48e00a55c"} Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.577296 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" event={"ID":"5478c134-753d-455a-9b4a-cfa5aeb41239","Type":"ContainerStarted","Data":"21d4023e0209c49e77734bfcd25b8b0477e6880bb9dc557f992860ab0c5abc9c"} Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.577366 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" event={"ID":"5478c134-753d-455a-9b4a-cfa5aeb41239","Type":"ContainerStarted","Data":"91593be0ff3a2b9c64bb91cdd47f502a6439b8da813cfdf67f9c7bcda9b0d16e"} Nov 29 14:35:51 crc kubenswrapper[4907]: I1129 14:35:51.577390 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" event={"ID":"5478c134-753d-455a-9b4a-cfa5aeb41239","Type":"ContainerStarted","Data":"11d8eccb406144e525ba4fc1491fc0d13a31e7251411abd09014a4439241f1b4"} Nov 29 14:35:52 crc kubenswrapper[4907]: I1129 14:35:52.587097 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cebfb025-6db9-479b-86e0-1f0044011f24","Type":"ContainerStarted","Data":"11fa0650b7dd431094b22bd40a9a14540e355df2ce4c09f227e7d1fc4566b29c"} Nov 29 14:35:52 crc kubenswrapper[4907]: I1129 14:35:52.587364 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cebfb025-6db9-479b-86e0-1f0044011f24","Type":"ContainerStarted","Data":"d40528084b95c326f8464fcbdc62379572f94065cb33890b5c371990883b9bd9"} Nov 29 14:35:52 crc kubenswrapper[4907]: I1129 14:35:52.588261 4907 generic.go:334] "Generic (PLEG): container finished" podID="fedc5702-915e-487c-a140-2e1a86299542" 
containerID="0f70c8a92533d9af1d9f3ce84923abe15a48ac4297604e1aa916799ba916373d" exitCode=0 Nov 29 14:35:52 crc kubenswrapper[4907]: I1129 14:35:52.588297 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fedc5702-915e-487c-a140-2e1a86299542","Type":"ContainerDied","Data":"0f70c8a92533d9af1d9f3ce84923abe15a48ac4297604e1aa916799ba916373d"} Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.605187 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" event={"ID":"7760e7b9-ceed-473a-810f-c4f5dc02bb41","Type":"ContainerStarted","Data":"29f17605b49840bb3adb6223a935ade04f1d11f955e1fd5c8b8441350b9d5464"} Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.612936 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cebfb025-6db9-479b-86e0-1f0044011f24","Type":"ContainerStarted","Data":"861d35bedb84b2e076df79c38df8574752b65385a10c12d3e624751753ebc46f"} Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.616745 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" event={"ID":"5478c134-753d-455a-9b4a-cfa5aeb41239","Type":"ContainerStarted","Data":"5dfa7f4a3c98b9787bf82ead47ede5672565c53b371b3f703dc8c7a543a74219"} Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.616786 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" event={"ID":"5478c134-753d-455a-9b4a-cfa5aeb41239","Type":"ContainerStarted","Data":"e1302315a7e6503957400c2b77f1a97a8bba0eacdd6a87851bc6ed0a3accf74c"} Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.616825 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" 
event={"ID":"5478c134-753d-455a-9b4a-cfa5aeb41239","Type":"ContainerStarted","Data":"e0ca27630bce9c0d229e568eebd462f58163aaaa122ed5634e31976c4acf12b5"} Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.617805 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.621812 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" event={"ID":"2f5506da-1d9d-4e8a-ac74-03cc70419787","Type":"ContainerStarted","Data":"50bf258ce631d72f6b9e7d43f604b52927e8eb9864dd1b0cfc6242ea44672fa8"} Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.626401 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.637726 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.646347 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" podStartSLOduration=4.559310195 podStartE2EDuration="6.644166516s" podCreationTimestamp="2025-11-29 14:35:48 +0000 UTC" firstStartedPulling="2025-11-29 14:35:50.834007006 +0000 UTC m=+448.820844658" lastFinishedPulling="2025-11-29 14:35:52.918863327 +0000 UTC m=+450.905700979" observedRunningTime="2025-11-29 14:35:54.638227927 +0000 UTC m=+452.625065609" watchObservedRunningTime="2025-11-29 14:35:54.644166516 +0000 UTC m=+452.631004188" Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.668211 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" podStartSLOduration=3.504182605 podStartE2EDuration="9.668182699s" podCreationTimestamp="2025-11-29 
14:35:45 +0000 UTC" firstStartedPulling="2025-11-29 14:35:47.345606644 +0000 UTC m=+445.332444296" lastFinishedPulling="2025-11-29 14:35:53.509606738 +0000 UTC m=+451.496444390" observedRunningTime="2025-11-29 14:35:54.667707535 +0000 UTC m=+452.654545227" watchObservedRunningTime="2025-11-29 14:35:54.668182699 +0000 UTC m=+452.655020381" Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.738378 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.892229519 podStartE2EDuration="10.738344432s" podCreationTimestamp="2025-11-29 14:35:44 +0000 UTC" firstStartedPulling="2025-11-29 14:35:45.664600889 +0000 UTC m=+443.651438541" lastFinishedPulling="2025-11-29 14:35:53.510715802 +0000 UTC m=+451.497553454" observedRunningTime="2025-11-29 14:35:54.702178153 +0000 UTC m=+452.689015805" watchObservedRunningTime="2025-11-29 14:35:54.738344432 +0000 UTC m=+452.725182094" Nov 29 14:35:54 crc kubenswrapper[4907]: I1129 14:35:54.741277 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6df995fb7f-d45dk" podStartSLOduration=3.665646369 podStartE2EDuration="5.741267601s" podCreationTimestamp="2025-11-29 14:35:49 +0000 UTC" firstStartedPulling="2025-11-29 14:35:50.833262544 +0000 UTC m=+448.820100196" lastFinishedPulling="2025-11-29 14:35:52.908883776 +0000 UTC m=+450.895721428" observedRunningTime="2025-11-29 14:35:54.736884389 +0000 UTC m=+452.723722061" watchObservedRunningTime="2025-11-29 14:35:54.741267601 +0000 UTC m=+452.728105253" Nov 29 14:35:56 crc kubenswrapper[4907]: I1129 14:35:56.090535 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6fc999d58d-xndt8" Nov 29 14:35:57 crc kubenswrapper[4907]: I1129 14:35:57.656594 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"fedc5702-915e-487c-a140-2e1a86299542","Type":"ContainerStarted","Data":"afb77cf471257eb4c7952ac577ca9c75a700f1f8554932307c0a67bade9f4adc"} Nov 29 14:35:57 crc kubenswrapper[4907]: I1129 14:35:57.657017 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fedc5702-915e-487c-a140-2e1a86299542","Type":"ContainerStarted","Data":"789fb7f0434ae5fe6b648b6fb69ad863da8c8cb9085b68dc7aa5ada4e195e759"} Nov 29 14:35:58 crc kubenswrapper[4907]: I1129 14:35:58.671065 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fedc5702-915e-487c-a140-2e1a86299542","Type":"ContainerStarted","Data":"d2a7c5b09ef685548334e145f7d0b13340a8e8063f86f28422e0da233f257ca4"} Nov 29 14:35:58 crc kubenswrapper[4907]: I1129 14:35:58.671482 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fedc5702-915e-487c-a140-2e1a86299542","Type":"ContainerStarted","Data":"f65a34de4c98eab1cf58aff4529f66b1e640cb5db9203a924fadfd59ed4913be"} Nov 29 14:35:58 crc kubenswrapper[4907]: I1129 14:35:58.671505 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fedc5702-915e-487c-a140-2e1a86299542","Type":"ContainerStarted","Data":"7a60a9be22962ffc7fa0ff390ebcd6a6cb467fe6109c0d0e48472a6b87f6f054"} Nov 29 14:35:58 crc kubenswrapper[4907]: I1129 14:35:58.671523 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fedc5702-915e-487c-a140-2e1a86299542","Type":"ContainerStarted","Data":"9f5369cb00fc67aa5d1acf95e41990873f0374d998999cc981a634834036ad62"} Nov 29 14:35:58 crc kubenswrapper[4907]: I1129 14:35:58.739079 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.180722338 podStartE2EDuration="9.73904507s" 
podCreationTimestamp="2025-11-29 14:35:49 +0000 UTC" firstStartedPulling="2025-11-29 14:35:52.591826567 +0000 UTC m=+450.578664219" lastFinishedPulling="2025-11-29 14:35:57.150149299 +0000 UTC m=+455.136986951" observedRunningTime="2025-11-29 14:35:58.721270055 +0000 UTC m=+456.708107747" watchObservedRunningTime="2025-11-29 14:35:58.73904507 +0000 UTC m=+456.725882762" Nov 29 14:35:58 crc kubenswrapper[4907]: I1129 14:35:58.750885 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:58 crc kubenswrapper[4907]: I1129 14:35:58.750944 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:58 crc kubenswrapper[4907]: I1129 14:35:58.757758 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:59 crc kubenswrapper[4907]: I1129 14:35:59.686270 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:35:59 crc kubenswrapper[4907]: I1129 14:35:59.802228 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5t44g"] Nov 29 14:36:00 crc kubenswrapper[4907]: I1129 14:36:00.260562 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:36:09 crc kubenswrapper[4907]: I1129 14:36:09.291626 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:36:09 crc kubenswrapper[4907]: I1129 14:36:09.292252 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:36:24 crc kubenswrapper[4907]: I1129 14:36:24.874874 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-f9d7485db-5t44g" podUID="0c70da9a-ff96-432f-81ad-382c70754e70" containerName="console" containerID="cri-o://7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d" gracePeriod=15 Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.861745 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5t44g_0c70da9a-ff96-432f-81ad-382c70754e70/console/0.log" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.862109 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.903293 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5t44g_0c70da9a-ff96-432f-81ad-382c70754e70/console/0.log" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.903350 4907 generic.go:334] "Generic (PLEG): container finished" podID="0c70da9a-ff96-432f-81ad-382c70754e70" containerID="7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d" exitCode=2 Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.903376 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5t44g" event={"ID":"0c70da9a-ff96-432f-81ad-382c70754e70","Type":"ContainerDied","Data":"7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d"} Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.903397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5t44g" event={"ID":"0c70da9a-ff96-432f-81ad-382c70754e70","Type":"ContainerDied","Data":"07c8cf20bbec9cd35ce2496b5a307d905e7de091190a7992d5c49f33ddcc2dd3"} Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.903425 4907 scope.go:117] "RemoveContainer" containerID="7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.903555 4907 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5t44g" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.929289 4907 scope.go:117] "RemoveContainer" containerID="7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d" Nov 29 14:36:25 crc kubenswrapper[4907]: E1129 14:36:25.929812 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d\": container with ID starting with 7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d not found: ID does not exist" containerID="7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.929862 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d"} err="failed to get container status \"7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d\": rpc error: code = NotFound desc = could not find container \"7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d\": container with ID starting with 7e8a2351b792551a738eeeb169cd93621d9652f6e39873e6c447a35fefb1da6d not found: ID does not exist" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.992776 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-trusted-ca-bundle\") pod \"0c70da9a-ff96-432f-81ad-382c70754e70\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.992814 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-serving-cert\") pod 
\"0c70da9a-ff96-432f-81ad-382c70754e70\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.993794 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-service-ca\") pod \"0c70da9a-ff96-432f-81ad-382c70754e70\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.993849 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0c70da9a-ff96-432f-81ad-382c70754e70" (UID: "0c70da9a-ff96-432f-81ad-382c70754e70"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.993909 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd7f8\" (UniqueName: \"kubernetes.io/projected/0c70da9a-ff96-432f-81ad-382c70754e70-kube-api-access-bd7f8\") pod \"0c70da9a-ff96-432f-81ad-382c70754e70\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.993956 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-console-config\") pod \"0c70da9a-ff96-432f-81ad-382c70754e70\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.994152 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-oauth-config\") pod \"0c70da9a-ff96-432f-81ad-382c70754e70\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " Nov 29 14:36:25 crc 
kubenswrapper[4907]: I1129 14:36:25.994196 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-oauth-serving-cert\") pod \"0c70da9a-ff96-432f-81ad-382c70754e70\" (UID: \"0c70da9a-ff96-432f-81ad-382c70754e70\") " Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.994317 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-service-ca" (OuterVolumeSpecName: "service-ca") pod "0c70da9a-ff96-432f-81ad-382c70754e70" (UID: "0c70da9a-ff96-432f-81ad-382c70754e70"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.995046 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.995108 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.995077 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-console-config" (OuterVolumeSpecName: "console-config") pod "0c70da9a-ff96-432f-81ad-382c70754e70" (UID: "0c70da9a-ff96-432f-81ad-382c70754e70"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:36:25 crc kubenswrapper[4907]: I1129 14:36:25.995729 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0c70da9a-ff96-432f-81ad-382c70754e70" (UID: "0c70da9a-ff96-432f-81ad-382c70754e70"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.002905 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0c70da9a-ff96-432f-81ad-382c70754e70" (UID: "0c70da9a-ff96-432f-81ad-382c70754e70"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.003116 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c70da9a-ff96-432f-81ad-382c70754e70-kube-api-access-bd7f8" (OuterVolumeSpecName: "kube-api-access-bd7f8") pod "0c70da9a-ff96-432f-81ad-382c70754e70" (UID: "0c70da9a-ff96-432f-81ad-382c70754e70"). InnerVolumeSpecName "kube-api-access-bd7f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.003923 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0c70da9a-ff96-432f-81ad-382c70754e70" (UID: "0c70da9a-ff96-432f-81ad-382c70754e70"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.096479 4907 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.096538 4907 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.096561 4907 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c70da9a-ff96-432f-81ad-382c70754e70-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.096579 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd7f8\" (UniqueName: \"kubernetes.io/projected/0c70da9a-ff96-432f-81ad-382c70754e70-kube-api-access-bd7f8\") on node \"crc\" DevicePath \"\"" Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.096601 4907 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c70da9a-ff96-432f-81ad-382c70754e70-console-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.259516 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5t44g"] Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.267152 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-5t44g"] Nov 29 14:36:26 crc kubenswrapper[4907]: I1129 14:36:26.490148 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c70da9a-ff96-432f-81ad-382c70754e70" 
path="/var/lib/kubelet/pods/0c70da9a-ff96-432f-81ad-382c70754e70/volumes" Nov 29 14:36:29 crc kubenswrapper[4907]: I1129 14:36:29.304629 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:36:29 crc kubenswrapper[4907]: I1129 14:36:29.313961 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-57749cbfbd-jqfsm" Nov 29 14:36:50 crc kubenswrapper[4907]: I1129 14:36:50.260970 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:36:50 crc kubenswrapper[4907]: I1129 14:36:50.315673 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:36:51 crc kubenswrapper[4907]: I1129 14:36:51.159076 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.226547 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5db8778c9-qk2w4"] Nov 29 14:37:03 crc kubenswrapper[4907]: E1129 14:37:03.231660 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c70da9a-ff96-432f-81ad-382c70754e70" containerName="console" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.231698 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c70da9a-ff96-432f-81ad-382c70754e70" containerName="console" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.231886 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c70da9a-ff96-432f-81ad-382c70754e70" containerName="console" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.232498 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.248132 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5db8778c9-qk2w4"] Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.341303 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-console-config\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.341349 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-oauth-config\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.341607 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-service-ca\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.341652 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-serving-cert\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.341803 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-trusted-ca-bundle\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.341914 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4knkk\" (UniqueName: \"kubernetes.io/projected/a94fe315-0819-408a-948e-ea4ce03dde60-kube-api-access-4knkk\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.341941 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-oauth-serving-cert\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.443077 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-trusted-ca-bundle\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.443187 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4knkk\" (UniqueName: \"kubernetes.io/projected/a94fe315-0819-408a-948e-ea4ce03dde60-kube-api-access-4knkk\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 
14:37:03.443229 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-oauth-serving-cert\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.443289 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-console-config\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.443331 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-oauth-config\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.443467 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-service-ca\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.443505 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-serving-cert\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.444611 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-console-config\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.444646 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-trusted-ca-bundle\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.445153 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-oauth-serving-cert\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.446126 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-service-ca\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.450856 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-oauth-config\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.451690 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-serving-cert\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.468502 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4knkk\" (UniqueName: \"kubernetes.io/projected/a94fe315-0819-408a-948e-ea4ce03dde60-kube-api-access-4knkk\") pod \"console-5db8778c9-qk2w4\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.559302 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:03 crc kubenswrapper[4907]: I1129 14:37:03.866692 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5db8778c9-qk2w4"] Nov 29 14:37:04 crc kubenswrapper[4907]: I1129 14:37:04.224969 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5db8778c9-qk2w4" event={"ID":"a94fe315-0819-408a-948e-ea4ce03dde60","Type":"ContainerStarted","Data":"c0b07e55d49d34b8a71b9a3d9185fcd1e2a66060042255152c4ee8c991c944d2"} Nov 29 14:37:04 crc kubenswrapper[4907]: I1129 14:37:04.225037 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5db8778c9-qk2w4" event={"ID":"a94fe315-0819-408a-948e-ea4ce03dde60","Type":"ContainerStarted","Data":"4ed6ba3028f60707113b0892a6ef2730d6501c43e5a7385fd25428c718e09236"} Nov 29 14:37:04 crc kubenswrapper[4907]: I1129 14:37:04.260417 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5db8778c9-qk2w4" podStartSLOduration=1.260390732 podStartE2EDuration="1.260390732s" podCreationTimestamp="2025-11-29 14:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:37:04.25856576 +0000 UTC m=+522.245403452" watchObservedRunningTime="2025-11-29 14:37:04.260390732 +0000 UTC m=+522.247228424" Nov 29 14:37:13 crc kubenswrapper[4907]: I1129 14:37:13.559665 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:13 crc kubenswrapper[4907]: I1129 14:37:13.560535 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:13 crc kubenswrapper[4907]: I1129 14:37:13.570155 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:14 crc kubenswrapper[4907]: I1129 14:37:14.334893 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:37:14 crc kubenswrapper[4907]: I1129 14:37:14.419299 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6857d79cbf-qntjc"] Nov 29 14:37:39 crc kubenswrapper[4907]: I1129 14:37:39.491651 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6857d79cbf-qntjc" podUID="6e8125a1-06a4-413d-a191-76857064a187" containerName="console" containerID="cri-o://a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec" gracePeriod=15 Nov 29 14:37:39 crc kubenswrapper[4907]: I1129 14:37:39.996337 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6857d79cbf-qntjc_6e8125a1-06a4-413d-a191-76857064a187/console/0.log" Nov 29 14:37:39 crc kubenswrapper[4907]: I1129 14:37:39.996901 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.029955 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-oauth-config\") pod \"6e8125a1-06a4-413d-a191-76857064a187\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.030088 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-serving-cert\") pod \"6e8125a1-06a4-413d-a191-76857064a187\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.030140 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-console-config\") pod \"6e8125a1-06a4-413d-a191-76857064a187\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.030198 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czncc\" (UniqueName: \"kubernetes.io/projected/6e8125a1-06a4-413d-a191-76857064a187-kube-api-access-czncc\") pod \"6e8125a1-06a4-413d-a191-76857064a187\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.030342 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-oauth-serving-cert\") pod \"6e8125a1-06a4-413d-a191-76857064a187\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.030403 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-trusted-ca-bundle\") pod \"6e8125a1-06a4-413d-a191-76857064a187\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.030505 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-service-ca\") pod \"6e8125a1-06a4-413d-a191-76857064a187\" (UID: \"6e8125a1-06a4-413d-a191-76857064a187\") " Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.031727 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6e8125a1-06a4-413d-a191-76857064a187" (UID: "6e8125a1-06a4-413d-a191-76857064a187"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.031797 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6e8125a1-06a4-413d-a191-76857064a187" (UID: "6e8125a1-06a4-413d-a191-76857064a187"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.031893 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-service-ca" (OuterVolumeSpecName: "service-ca") pod "6e8125a1-06a4-413d-a191-76857064a187" (UID: "6e8125a1-06a4-413d-a191-76857064a187"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.032371 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-console-config" (OuterVolumeSpecName: "console-config") pod "6e8125a1-06a4-413d-a191-76857064a187" (UID: "6e8125a1-06a4-413d-a191-76857064a187"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.041376 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6e8125a1-06a4-413d-a191-76857064a187" (UID: "6e8125a1-06a4-413d-a191-76857064a187"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.041530 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e8125a1-06a4-413d-a191-76857064a187-kube-api-access-czncc" (OuterVolumeSpecName: "kube-api-access-czncc") pod "6e8125a1-06a4-413d-a191-76857064a187" (UID: "6e8125a1-06a4-413d-a191-76857064a187"). InnerVolumeSpecName "kube-api-access-czncc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.041607 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6e8125a1-06a4-413d-a191-76857064a187" (UID: "6e8125a1-06a4-413d-a191-76857064a187"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.132658 4907 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.132714 4907 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-console-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.132734 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czncc\" (UniqueName: \"kubernetes.io/projected/6e8125a1-06a4-413d-a191-76857064a187-kube-api-access-czncc\") on node \"crc\" DevicePath \"\"" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.132754 4907 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.132771 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.132788 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e8125a1-06a4-413d-a191-76857064a187-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.132805 4907 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e8125a1-06a4-413d-a191-76857064a187-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:37:40 crc 
kubenswrapper[4907]: I1129 14:37:40.561555 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6857d79cbf-qntjc_6e8125a1-06a4-413d-a191-76857064a187/console/0.log" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.561617 4907 generic.go:334] "Generic (PLEG): container finished" podID="6e8125a1-06a4-413d-a191-76857064a187" containerID="a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec" exitCode=2 Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.561660 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6857d79cbf-qntjc" event={"ID":"6e8125a1-06a4-413d-a191-76857064a187","Type":"ContainerDied","Data":"a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec"} Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.561716 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6857d79cbf-qntjc" event={"ID":"6e8125a1-06a4-413d-a191-76857064a187","Type":"ContainerDied","Data":"07402786aba9b62f0c0a21d7cf7b9624368599961bcfee5c5a3b97eecbf60898"} Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.561715 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6857d79cbf-qntjc" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.561838 4907 scope.go:117] "RemoveContainer" containerID="a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.593323 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6857d79cbf-qntjc"] Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.596284 4907 scope.go:117] "RemoveContainer" containerID="a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec" Nov 29 14:37:40 crc kubenswrapper[4907]: E1129 14:37:40.597159 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec\": container with ID starting with a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec not found: ID does not exist" containerID="a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.597219 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec"} err="failed to get container status \"a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec\": rpc error: code = NotFound desc = could not find container \"a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec\": container with ID starting with a3d8ae7f19dcd7f2b11807a316ada41e9a908e14aff939ab3a6b05cf1a0f59ec not found: ID does not exist" Nov 29 14:37:40 crc kubenswrapper[4907]: I1129 14:37:40.602505 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6857d79cbf-qntjc"] Nov 29 14:37:42 crc kubenswrapper[4907]: I1129 14:37:42.495485 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6e8125a1-06a4-413d-a191-76857064a187" path="/var/lib/kubelet/pods/6e8125a1-06a4-413d-a191-76857064a187/volumes" Nov 29 14:37:58 crc kubenswrapper[4907]: I1129 14:37:58.490324 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:37:58 crc kubenswrapper[4907]: I1129 14:37:58.492632 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:38:28 crc kubenswrapper[4907]: I1129 14:38:28.491180 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:38:28 crc kubenswrapper[4907]: I1129 14:38:28.491988 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:38:58 crc kubenswrapper[4907]: I1129 14:38:58.496916 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:38:58 crc 
kubenswrapper[4907]: I1129 14:38:58.497690 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:38:58 crc kubenswrapper[4907]: I1129 14:38:58.497749 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:38:58 crc kubenswrapper[4907]: I1129 14:38:58.498698 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6266188cd3801cb79e9076cc411ccc7b4b18d94f48d528ad87b448fafd9cdc7d"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 14:38:58 crc kubenswrapper[4907]: I1129 14:38:58.498753 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://6266188cd3801cb79e9076cc411ccc7b4b18d94f48d528ad87b448fafd9cdc7d" gracePeriod=600 Nov 29 14:38:59 crc kubenswrapper[4907]: I1129 14:38:59.256336 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="6266188cd3801cb79e9076cc411ccc7b4b18d94f48d528ad87b448fafd9cdc7d" exitCode=0 Nov 29 14:38:59 crc kubenswrapper[4907]: I1129 14:38:59.256430 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"6266188cd3801cb79e9076cc411ccc7b4b18d94f48d528ad87b448fafd9cdc7d"} 
Nov 29 14:38:59 crc kubenswrapper[4907]: I1129 14:38:59.257692 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"f6d0279b6c0a0b7cac049f6991025c0a86d66be3a78b2d46b37cde84b40abbc6"} Nov 29 14:38:59 crc kubenswrapper[4907]: I1129 14:38:59.257785 4907 scope.go:117] "RemoveContainer" containerID="a456b93cdbff1001e9ff31e71e560207b63f5cbe6f442049caf8634aa78242ee" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.690233 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn"] Nov 29 14:40:12 crc kubenswrapper[4907]: E1129 14:40:12.691210 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e8125a1-06a4-413d-a191-76857064a187" containerName="console" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.691226 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e8125a1-06a4-413d-a191-76857064a187" containerName="console" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.691379 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e8125a1-06a4-413d-a191-76857064a187" containerName="console" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.692365 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.695998 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.712069 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn"] Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.835518 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.835611 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvjc4\" (UniqueName: \"kubernetes.io/projected/90303a73-fb9d-454b-a241-ffacdb554862-kube-api-access-qvjc4\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.835661 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:12 crc kubenswrapper[4907]: 
I1129 14:40:12.936459 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvjc4\" (UniqueName: \"kubernetes.io/projected/90303a73-fb9d-454b-a241-ffacdb554862-kube-api-access-qvjc4\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.936544 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.936593 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.937214 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.937228 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:12 crc kubenswrapper[4907]: I1129 14:40:12.961282 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvjc4\" (UniqueName: \"kubernetes.io/projected/90303a73-fb9d-454b-a241-ffacdb554862-kube-api-access-qvjc4\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:13 crc kubenswrapper[4907]: I1129 14:40:13.013826 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:13 crc kubenswrapper[4907]: I1129 14:40:13.509090 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn"] Nov 29 14:40:14 crc kubenswrapper[4907]: I1129 14:40:14.128056 4907 generic.go:334] "Generic (PLEG): container finished" podID="90303a73-fb9d-454b-a241-ffacdb554862" containerID="d89736fa6e06b24b8674bae93fe61a41056a7c1c5757bb753d252a21acc6be7d" exitCode=0 Nov 29 14:40:14 crc kubenswrapper[4907]: I1129 14:40:14.128117 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" event={"ID":"90303a73-fb9d-454b-a241-ffacdb554862","Type":"ContainerDied","Data":"d89736fa6e06b24b8674bae93fe61a41056a7c1c5757bb753d252a21acc6be7d"} Nov 29 14:40:14 crc kubenswrapper[4907]: I1129 14:40:14.128466 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" event={"ID":"90303a73-fb9d-454b-a241-ffacdb554862","Type":"ContainerStarted","Data":"256db082d77c1600634b2bae0a96696b27392e2dff6013aba1843bb4b64ebf0f"} Nov 29 14:40:14 crc kubenswrapper[4907]: I1129 14:40:14.133018 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 14:40:16 crc kubenswrapper[4907]: I1129 14:40:16.150524 4907 generic.go:334] "Generic (PLEG): container finished" podID="90303a73-fb9d-454b-a241-ffacdb554862" containerID="d3a431052b1611412692a4153117d57b78e46bcb065c6477d68eb4c30a3e8c9f" exitCode=0 Nov 29 14:40:16 crc kubenswrapper[4907]: I1129 14:40:16.150604 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" event={"ID":"90303a73-fb9d-454b-a241-ffacdb554862","Type":"ContainerDied","Data":"d3a431052b1611412692a4153117d57b78e46bcb065c6477d68eb4c30a3e8c9f"} Nov 29 14:40:17 crc kubenswrapper[4907]: I1129 14:40:17.161172 4907 generic.go:334] "Generic (PLEG): container finished" podID="90303a73-fb9d-454b-a241-ffacdb554862" containerID="41512ff1b3315578770aa1f30aa9293cee747b2cd064416aac5aa51bc208fab3" exitCode=0 Nov 29 14:40:17 crc kubenswrapper[4907]: I1129 14:40:17.161269 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" event={"ID":"90303a73-fb9d-454b-a241-ffacdb554862","Type":"ContainerDied","Data":"41512ff1b3315578770aa1f30aa9293cee747b2cd064416aac5aa51bc208fab3"} Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.505344 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.641140 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvjc4\" (UniqueName: \"kubernetes.io/projected/90303a73-fb9d-454b-a241-ffacdb554862-kube-api-access-qvjc4\") pod \"90303a73-fb9d-454b-a241-ffacdb554862\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.641284 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-util\") pod \"90303a73-fb9d-454b-a241-ffacdb554862\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.641306 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-bundle\") pod \"90303a73-fb9d-454b-a241-ffacdb554862\" (UID: \"90303a73-fb9d-454b-a241-ffacdb554862\") " Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.646379 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-bundle" (OuterVolumeSpecName: "bundle") pod "90303a73-fb9d-454b-a241-ffacdb554862" (UID: "90303a73-fb9d-454b-a241-ffacdb554862"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.654920 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90303a73-fb9d-454b-a241-ffacdb554862-kube-api-access-qvjc4" (OuterVolumeSpecName: "kube-api-access-qvjc4") pod "90303a73-fb9d-454b-a241-ffacdb554862" (UID: "90303a73-fb9d-454b-a241-ffacdb554862"). InnerVolumeSpecName "kube-api-access-qvjc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.671346 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-util" (OuterVolumeSpecName: "util") pod "90303a73-fb9d-454b-a241-ffacdb554862" (UID: "90303a73-fb9d-454b-a241-ffacdb554862"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.743770 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.743899 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90303a73-fb9d-454b-a241-ffacdb554862-util\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:18 crc kubenswrapper[4907]: I1129 14:40:18.743922 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvjc4\" (UniqueName: \"kubernetes.io/projected/90303a73-fb9d-454b-a241-ffacdb554862-kube-api-access-qvjc4\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:19 crc kubenswrapper[4907]: I1129 14:40:19.180764 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" event={"ID":"90303a73-fb9d-454b-a241-ffacdb554862","Type":"ContainerDied","Data":"256db082d77c1600634b2bae0a96696b27392e2dff6013aba1843bb4b64ebf0f"} Nov 29 14:40:19 crc kubenswrapper[4907]: I1129 14:40:19.180819 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="256db082d77c1600634b2bae0a96696b27392e2dff6013aba1843bb4b64ebf0f" Nov 29 14:40:19 crc kubenswrapper[4907]: I1129 14:40:19.181079 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn" Nov 29 14:40:26 crc kubenswrapper[4907]: I1129 14:40:26.466664 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dtnl8"] Nov 29 14:40:26 crc kubenswrapper[4907]: I1129 14:40:26.467951 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovn-controller" containerID="cri-o://f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb" gracePeriod=30 Nov 29 14:40:26 crc kubenswrapper[4907]: I1129 14:40:26.468015 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="nbdb" containerID="cri-o://47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4" gracePeriod=30 Nov 29 14:40:26 crc kubenswrapper[4907]: I1129 14:40:26.468120 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe" gracePeriod=30 Nov 29 14:40:26 crc kubenswrapper[4907]: I1129 14:40:26.468103 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="sbdb" containerID="cri-o://62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a" gracePeriod=30 Nov 29 14:40:26 crc kubenswrapper[4907]: I1129 14:40:26.468193 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" 
containerName="northd" containerID="cri-o://6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29" gracePeriod=30 Nov 29 14:40:26 crc kubenswrapper[4907]: I1129 14:40:26.468127 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovn-acl-logging" containerID="cri-o://b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813" gracePeriod=30 Nov 29 14:40:26 crc kubenswrapper[4907]: I1129 14:40:26.468116 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kube-rbac-proxy-node" containerID="cri-o://51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d" gracePeriod=30 Nov 29 14:40:26 crc kubenswrapper[4907]: I1129 14:40:26.514649 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" containerID="cri-o://855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d" gracePeriod=30 Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.234022 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/2.log" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.234615 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/1.log" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.234676 4907 generic.go:334] "Generic (PLEG): container finished" podID="3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4" containerID="758c2a8240a7ddc01c0eefe154215e74709991c70756567f3ce4c50d9d63ef7f" exitCode=2 Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.234765 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5zvb" event={"ID":"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4","Type":"ContainerDied","Data":"758c2a8240a7ddc01c0eefe154215e74709991c70756567f3ce4c50d9d63ef7f"} Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.234849 4907 scope.go:117] "RemoveContainer" containerID="6d855997e199e8c32067f8e32d958526cdb8a19406794035937f3e7f77cb9bc8" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.235579 4907 scope.go:117] "RemoveContainer" containerID="758c2a8240a7ddc01c0eefe154215e74709991c70756567f3ce4c50d9d63ef7f" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.235961 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d5zvb_openshift-multus(3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4)\"" pod="openshift-multus/multus-d5zvb" podUID="3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.237086 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovnkube-controller/3.log" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.239421 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovn-acl-logging/0.log" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.239957 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovn-controller/0.log" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240324 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d" exitCode=0 Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 
14:40:27.240351 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a" exitCode=0 Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240361 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4" exitCode=0 Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240369 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29" exitCode=0 Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240380 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813" exitCode=143 Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240391 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb" exitCode=143 Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240402 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d"} Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240464 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a"} Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240478 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4"} Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240489 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29"} Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240501 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813"} Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.240514 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb"} Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.333411 4907 scope.go:117] "RemoveContainer" containerID="34d955dd674af8ad5752a7feae7ac1e947a75782da0bcba8379766951d1d6c92" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.742100 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovn-acl-logging/0.log" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.742512 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovn-controller/0.log" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.742835 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804026 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-etc-openvswitch\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804066 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804085 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-ovn-kubernetes\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804112 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5339013-9544-4e7e-a449-c257f1086638-ovn-node-metrics-cert\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804135 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-env-overrides\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804152 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-node-log\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804173 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-openvswitch\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804162 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804190 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-var-lib-openvswitch\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804206 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-kubelet\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804222 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804230 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-netns\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804250 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-script-lib\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804217 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804273 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804249 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-node-log" (OuterVolumeSpecName: "node-log") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804257 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804278 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grsbm\" (UniqueName: \"kubernetes.io/projected/e5339013-9544-4e7e-a449-c257f1086638-kube-api-access-grsbm\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804300 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804354 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-systemd-units\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804334 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804394 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-systemd\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804411 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804425 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-config\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804459 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-ovn\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804494 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-log-socket\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804508 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-netd\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804532 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-bin\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804550 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-slash\") pod \"e5339013-9544-4e7e-a449-c257f1086638\" (UID: \"e5339013-9544-4e7e-a449-c257f1086638\") " Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804530 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804611 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804653 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804674 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-log-socket" (OuterVolumeSpecName: "log-socket") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804692 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804714 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-slash" (OuterVolumeSpecName: "host-slash") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804744 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.804938 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805039 4907 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805053 4907 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805073 4907 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805083 4907 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805092 4907 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-node-log\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805101 4907 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805108 4907 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-var-lib-openvswitch\") on node \"crc\" DevicePath 
\"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805116 4907 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805126 4907 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805135 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805142 4907 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805151 4907 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805159 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5339013-9544-4e7e-a449-c257f1086638-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805166 4907 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-log-socket\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805174 4907 reconciler_common.go:293] "Volume detached for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805182 4907 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.805189 4907 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-host-slash\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.829688 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5339013-9544-4e7e-a449-c257f1086638-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.829724 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5339013-9544-4e7e-a449-c257f1086638-kube-api-access-grsbm" (OuterVolumeSpecName: "kube-api-access-grsbm") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "kube-api-access-grsbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.838961 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-khkjj"] Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839232 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839253 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839271 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kube-rbac-proxy-node" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839278 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kube-rbac-proxy-node" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839294 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839300 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839311 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="nbdb" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839319 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="nbdb" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839331 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" 
containerName="sbdb" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839340 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="sbdb" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839349 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovn-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839356 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovn-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839365 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kubecfg-setup" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839372 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kubecfg-setup" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839381 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90303a73-fb9d-454b-a241-ffacdb554862" containerName="util" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839387 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="90303a73-fb9d-454b-a241-ffacdb554862" containerName="util" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839397 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovn-acl-logging" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839403 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovn-acl-logging" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839411 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc 
kubenswrapper[4907]: I1129 14:40:27.839417 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.839426 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90303a73-fb9d-454b-a241-ffacdb554862" containerName="pull" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.839433 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="90303a73-fb9d-454b-a241-ffacdb554862" containerName="pull" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.843481 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kube-rbac-proxy-ovn-metrics" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843489 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kube-rbac-proxy-ovn-metrics" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.843499 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="northd" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843504 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="northd" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.843514 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90303a73-fb9d-454b-a241-ffacdb554862" containerName="extract" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843521 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="90303a73-fb9d-454b-a241-ffacdb554862" containerName="extract" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843633 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovn-acl-logging" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843643 
4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843651 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="nbdb" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843662 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovn-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843671 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="sbdb" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843681 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kube-rbac-proxy-node" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843689 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843697 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="kube-rbac-proxy-ovn-metrics" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843705 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="northd" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843711 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843718 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="90303a73-fb9d-454b-a241-ffacdb554862" containerName="extract" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 
14:40:27.843726 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.843828 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843837 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.843952 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: E1129 14:40:27.844067 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.844074 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5339013-9544-4e7e-a449-c257f1086638" containerName="ovnkube-controller" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.844724 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e5339013-9544-4e7e-a449-c257f1086638" (UID: "e5339013-9544-4e7e-a449-c257f1086638"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.846326 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905492 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-node-log\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905532 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-cni-bin\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905555 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ece93209-da41-4af9-b370-37edb642194e-env-overrides\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905574 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905638 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/ece93209-da41-4af9-b370-37edb642194e-ovnkube-script-lib\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905718 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-slash\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905735 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-etc-openvswitch\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905779 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-kubelet\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905809 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-run-systemd\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905852 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-run-ovn\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905871 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-run-ovn-kubernetes\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905897 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-systemd-units\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905967 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ece93209-da41-4af9-b370-37edb642194e-ovn-node-metrics-cert\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.905984 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-run-netns\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.906003 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-cni-netd\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.906142 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-run-openvswitch\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.906205 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-var-lib-openvswitch\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.906242 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khdhx\" (UniqueName: \"kubernetes.io/projected/ece93209-da41-4af9-b370-37edb642194e-kube-api-access-khdhx\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.906291 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ece93209-da41-4af9-b370-37edb642194e-ovnkube-config\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.906324 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-log-socket\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.906434 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grsbm\" (UniqueName: \"kubernetes.io/projected/e5339013-9544-4e7e-a449-c257f1086638-kube-api-access-grsbm\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.906470 4907 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5339013-9544-4e7e-a449-c257f1086638-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:27 crc kubenswrapper[4907]: I1129 14:40:27.906482 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5339013-9544-4e7e-a449-c257f1086638-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007046 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-run-openvswitch\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007107 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-var-lib-openvswitch\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007127 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-khdhx\" (UniqueName: \"kubernetes.io/projected/ece93209-da41-4af9-b370-37edb642194e-kube-api-access-khdhx\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007145 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ece93209-da41-4af9-b370-37edb642194e-ovnkube-config\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007164 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-log-socket\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007177 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-run-openvswitch\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007216 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-var-lib-openvswitch\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007225 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-node-log\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007190 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-node-log\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007340 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-cni-bin\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007243 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-log-socket\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007380 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ece93209-da41-4af9-b370-37edb642194e-env-overrides\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007414 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-cni-bin\") pod \"ovnkube-node-khkjj\" (UID: 
\"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007425 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007470 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ece93209-da41-4af9-b370-37edb642194e-ovnkube-script-lib\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007514 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-slash\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007529 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-etc-openvswitch\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007555 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007580 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-kubelet\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007561 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-kubelet\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007607 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-slash\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007621 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-run-systemd\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007654 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-run-ovn\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 
14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007674 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-run-ovn-kubernetes\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007689 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-systemd-units\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007731 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ece93209-da41-4af9-b370-37edb642194e-ovn-node-metrics-cert\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007746 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-run-netns\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007770 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-cni-netd\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 
14:40:28.007846 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-cni-netd\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007870 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-etc-openvswitch\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007890 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-run-systemd\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007913 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-run-ovn\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007934 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-run-ovn-kubernetes\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007948 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ece93209-da41-4af9-b370-37edb642194e-env-overrides\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007958 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ece93209-da41-4af9-b370-37edb642194e-ovnkube-config\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007989 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-host-run-netns\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.007953 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ece93209-da41-4af9-b370-37edb642194e-systemd-units\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.008325 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ece93209-da41-4af9-b370-37edb642194e-ovnkube-script-lib\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.011936 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ece93209-da41-4af9-b370-37edb642194e-ovn-node-metrics-cert\") pod 
\"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.066112 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khdhx\" (UniqueName: \"kubernetes.io/projected/ece93209-da41-4af9-b370-37edb642194e-kube-api-access-khdhx\") pod \"ovnkube-node-khkjj\" (UID: \"ece93209-da41-4af9-b370-37edb642194e\") " pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.161703 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.247029 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerStarted","Data":"b9df7fedb1d9afa02f68673ac2a0e8854201636ccbf8fca514de7998d5fff59f"} Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.250897 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovn-acl-logging/0.log" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.251562 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtnl8_e5339013-9544-4e7e-a449-c257f1086638/ovn-controller/0.log" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.251969 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe" exitCode=0 Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.252001 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5339013-9544-4e7e-a449-c257f1086638" containerID="51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d" exitCode=0 Nov 
29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.252041 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe"} Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.252085 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.252118 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d"} Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.252137 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtnl8" event={"ID":"e5339013-9544-4e7e-a449-c257f1086638","Type":"ContainerDied","Data":"24a5b3a2a144c6df76a28e440dd384c01ef566ae2cf415a58902c213034e790c"} Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.252142 4907 scope.go:117] "RemoveContainer" containerID="855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.254124 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/2.log" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.269692 4907 scope.go:117] "RemoveContainer" containerID="62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.287565 4907 scope.go:117] "RemoveContainer" containerID="47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.296671 4907 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dtnl8"] Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.302017 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dtnl8"] Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.305586 4907 scope.go:117] "RemoveContainer" containerID="6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.333664 4907 scope.go:117] "RemoveContainer" containerID="108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.350768 4907 scope.go:117] "RemoveContainer" containerID="51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.369789 4907 scope.go:117] "RemoveContainer" containerID="b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.390890 4907 scope.go:117] "RemoveContainer" containerID="f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.419494 4907 scope.go:117] "RemoveContainer" containerID="4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.443113 4907 scope.go:117] "RemoveContainer" containerID="855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d" Nov 29 14:40:28 crc kubenswrapper[4907]: E1129 14:40:28.443702 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d\": container with ID starting with 855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d not found: ID does not exist" containerID="855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 
14:40:28.443749 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d"} err="failed to get container status \"855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d\": rpc error: code = NotFound desc = could not find container \"855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d\": container with ID starting with 855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.443777 4907 scope.go:117] "RemoveContainer" containerID="62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a" Nov 29 14:40:28 crc kubenswrapper[4907]: E1129 14:40:28.444124 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\": container with ID starting with 62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a not found: ID does not exist" containerID="62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.444156 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a"} err="failed to get container status \"62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\": rpc error: code = NotFound desc = could not find container \"62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\": container with ID starting with 62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.444186 4907 scope.go:117] "RemoveContainer" containerID="47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4" Nov 29 14:40:28 crc 
kubenswrapper[4907]: E1129 14:40:28.444597 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\": container with ID starting with 47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4 not found: ID does not exist" containerID="47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.444623 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4"} err="failed to get container status \"47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\": rpc error: code = NotFound desc = could not find container \"47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\": container with ID starting with 47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4 not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.444638 4907 scope.go:117] "RemoveContainer" containerID="6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29" Nov 29 14:40:28 crc kubenswrapper[4907]: E1129 14:40:28.445084 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\": container with ID starting with 6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29 not found: ID does not exist" containerID="6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.445106 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29"} err="failed to get container status 
\"6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\": rpc error: code = NotFound desc = could not find container \"6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\": container with ID starting with 6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29 not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.445120 4907 scope.go:117] "RemoveContainer" containerID="108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe" Nov 29 14:40:28 crc kubenswrapper[4907]: E1129 14:40:28.445464 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\": container with ID starting with 108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe not found: ID does not exist" containerID="108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.445483 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe"} err="failed to get container status \"108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\": rpc error: code = NotFound desc = could not find container \"108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\": container with ID starting with 108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.445499 4907 scope.go:117] "RemoveContainer" containerID="51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d" Nov 29 14:40:28 crc kubenswrapper[4907]: E1129 14:40:28.445772 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\": container with ID starting with 51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d not found: ID does not exist" containerID="51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.445792 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d"} err="failed to get container status \"51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\": rpc error: code = NotFound desc = could not find container \"51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\": container with ID starting with 51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.445805 4907 scope.go:117] "RemoveContainer" containerID="b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813" Nov 29 14:40:28 crc kubenswrapper[4907]: E1129 14:40:28.446296 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\": container with ID starting with b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813 not found: ID does not exist" containerID="b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.446346 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813"} err="failed to get container status \"b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\": rpc error: code = NotFound desc = could not find container \"b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\": container with ID 
starting with b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813 not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.446392 4907 scope.go:117] "RemoveContainer" containerID="f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb" Nov 29 14:40:28 crc kubenswrapper[4907]: E1129 14:40:28.446715 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\": container with ID starting with f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb not found: ID does not exist" containerID="f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.446747 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb"} err="failed to get container status \"f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\": rpc error: code = NotFound desc = could not find container \"f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\": container with ID starting with f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.446762 4907 scope.go:117] "RemoveContainer" containerID="4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76" Nov 29 14:40:28 crc kubenswrapper[4907]: E1129 14:40:28.447131 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\": container with ID starting with 4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76 not found: ID does not exist" containerID="4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76" Nov 29 
14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.447166 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76"} err="failed to get container status \"4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\": rpc error: code = NotFound desc = could not find container \"4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\": container with ID starting with 4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76 not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.447187 4907 scope.go:117] "RemoveContainer" containerID="855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.447630 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d"} err="failed to get container status \"855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d\": rpc error: code = NotFound desc = could not find container \"855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d\": container with ID starting with 855aaef7531ec868ab231ee8d8e24923f6bbc0fafd81047ec0d90ee9d3e3997d not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.447662 4907 scope.go:117] "RemoveContainer" containerID="62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.448050 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a"} err="failed to get container status \"62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\": rpc error: code = NotFound desc = could not find container 
\"62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a\": container with ID starting with 62ac4648b9d591bc3810fffef050f1567a12630f21a8463bd012ad10e35f752a not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.448076 4907 scope.go:117] "RemoveContainer" containerID="47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.448254 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4"} err="failed to get container status \"47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\": rpc error: code = NotFound desc = could not find container \"47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4\": container with ID starting with 47959948a9ab753ff8bd56e0ce0d600e4b46b075ca635d9bc8afb7c04c1c54b4 not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.448274 4907 scope.go:117] "RemoveContainer" containerID="6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.448540 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29"} err="failed to get container status \"6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\": rpc error: code = NotFound desc = could not find container \"6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29\": container with ID starting with 6d670320219d21cdc1290b0f9473729cf796535b580d5b0a96b2b13e36be1e29 not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.448560 4907 scope.go:117] "RemoveContainer" containerID="108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.448772 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe"} err="failed to get container status \"108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\": rpc error: code = NotFound desc = could not find container \"108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe\": container with ID starting with 108afceff38811a8467cbee0c2569cdf9a9fa5b1a55b2a676689967bb54c6efe not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.448791 4907 scope.go:117] "RemoveContainer" containerID="51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.448981 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d"} err="failed to get container status \"51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\": rpc error: code = NotFound desc = could not find container \"51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d\": container with ID starting with 51716d9c707cb4a3d79a6b5759ad8969f00ae1ebce49b5d9546f04b02705d43d not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.449001 4907 scope.go:117] "RemoveContainer" containerID="b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.449167 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813"} err="failed to get container status \"b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\": rpc error: code = NotFound desc = could not find container \"b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813\": container with ID starting with 
b4db4d938893d26a36107dd0b57ce26fbca0bc4f07142526850797d9bc57e813 not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.449188 4907 scope.go:117] "RemoveContainer" containerID="f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.449448 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb"} err="failed to get container status \"f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\": rpc error: code = NotFound desc = could not find container \"f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb\": container with ID starting with f8ddea7974d811d3f84853849c6cf917965ffb6b96290ff4c19cf6bd8796baeb not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.449468 4907 scope.go:117] "RemoveContainer" containerID="4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.449649 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76"} err="failed to get container status \"4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\": rpc error: code = NotFound desc = could not find container \"4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76\": container with ID starting with 4e4cd47ba08ce26e43387f9f89dd9ea8de0324ff3161435f2754ab24aa22ae76 not found: ID does not exist" Nov 29 14:40:28 crc kubenswrapper[4907]: I1129 14:40:28.485982 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5339013-9544-4e7e-a449-c257f1086638" path="/var/lib/kubelet/pods/e5339013-9544-4e7e-a449-c257f1086638/volumes" Nov 29 14:40:29 crc kubenswrapper[4907]: I1129 14:40:29.262824 4907 generic.go:334] "Generic (PLEG): container 
finished" podID="ece93209-da41-4af9-b370-37edb642194e" containerID="356e3452b77451d9b0637b4d4158aa6038720cfcf56032b6a3c30a7992df5a9a" exitCode=0 Nov 29 14:40:29 crc kubenswrapper[4907]: I1129 14:40:29.262938 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerDied","Data":"356e3452b77451d9b0637b4d4158aa6038720cfcf56032b6a3c30a7992df5a9a"} Nov 29 14:40:30 crc kubenswrapper[4907]: I1129 14:40:30.276889 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerStarted","Data":"399124aa689432965fb0db48a9b40b38f008f211a04283872432ee1343e73864"} Nov 29 14:40:30 crc kubenswrapper[4907]: I1129 14:40:30.276947 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerStarted","Data":"55b06b88cd25edde960526e1156e181aca00b8af239319cb44627a8d9548fe7c"} Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.286081 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerStarted","Data":"53590c346fe0f59da0a71247b7fc2f8bbc000cb076f496cfeefd00d9bc17c981"} Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.286738 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerStarted","Data":"374ea5e045d34e74b6de2165b96bc4b8317383447c2ea5e561df44db2301dc7d"} Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.286755 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" 
event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerStarted","Data":"a6d219d7ddd917e03554da932df4d56cbb4b75cbeb021d85723e1a0f8d57873b"} Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.355363 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8"] Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.356980 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.360243 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.360614 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.360749 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-stjsl" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.384025 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgzz6\" (UniqueName: \"kubernetes.io/projected/5c8cbe86-4142-478f-add6-b7d0baf83de6-kube-api-access-vgzz6\") pod \"obo-prometheus-operator-668cf9dfbb-jt7c8\" (UID: \"5c8cbe86-4142-478f-add6-b7d0baf83de6\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.455255 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc"] Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.458419 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.461187 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.462908 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qjjk5" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.467296 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9"] Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.468330 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.485597 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgzz6\" (UniqueName: \"kubernetes.io/projected/5c8cbe86-4142-478f-add6-b7d0baf83de6-kube-api-access-vgzz6\") pod \"obo-prometheus-operator-668cf9dfbb-jt7c8\" (UID: \"5c8cbe86-4142-478f-add6-b7d0baf83de6\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.507977 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgzz6\" (UniqueName: \"kubernetes.io/projected/5c8cbe86-4142-478f-add6-b7d0baf83de6-kube-api-access-vgzz6\") pod \"obo-prometheus-operator-668cf9dfbb-jt7c8\" (UID: \"5c8cbe86-4142-478f-add6-b7d0baf83de6\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.588401 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b633428-8d76-48d9-bde6-b6233e1d7f40-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9\" (UID: \"4b633428-8d76-48d9-bde6-b6233e1d7f40\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.588537 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/21df79d3-1565-4ab3-bdff-8f63941a44f2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc\" (UID: \"21df79d3-1565-4ab3-bdff-8f63941a44f2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.588656 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b633428-8d76-48d9-bde6-b6233e1d7f40-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9\" (UID: \"4b633428-8d76-48d9-bde6-b6233e1d7f40\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.588715 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/21df79d3-1565-4ab3-bdff-8f63941a44f2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc\" (UID: \"21df79d3-1565-4ab3-bdff-8f63941a44f2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.622356 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-sks2s"] Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.634898 4907 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.637407 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-48k9p" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.642211 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.676226 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.691050 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b633428-8d76-48d9-bde6-b6233e1d7f40-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9\" (UID: \"4b633428-8d76-48d9-bde6-b6233e1d7f40\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.691122 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e59fbf3-ac79-42b2-84c9-f2afa27c4efb-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-sks2s\" (UID: \"9e59fbf3-ac79-42b2-84c9-f2afa27c4efb\") " pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.691146 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/21df79d3-1565-4ab3-bdff-8f63941a44f2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc\" (UID: \"21df79d3-1565-4ab3-bdff-8f63941a44f2\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.691167 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b5k4\" (UniqueName: \"kubernetes.io/projected/9e59fbf3-ac79-42b2-84c9-f2afa27c4efb-kube-api-access-4b5k4\") pod \"observability-operator-d8bb48f5d-sks2s\" (UID: \"9e59fbf3-ac79-42b2-84c9-f2afa27c4efb\") " pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.691205 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b633428-8d76-48d9-bde6-b6233e1d7f40-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9\" (UID: \"4b633428-8d76-48d9-bde6-b6233e1d7f40\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.691229 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/21df79d3-1565-4ab3-bdff-8f63941a44f2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc\" (UID: \"21df79d3-1565-4ab3-bdff-8f63941a44f2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.694673 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/21df79d3-1565-4ab3-bdff-8f63941a44f2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc\" (UID: \"21df79d3-1565-4ab3-bdff-8f63941a44f2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.697427 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b633428-8d76-48d9-bde6-b6233e1d7f40-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9\" (UID: \"4b633428-8d76-48d9-bde6-b6233e1d7f40\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.697877 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/21df79d3-1565-4ab3-bdff-8f63941a44f2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc\" (UID: \"21df79d3-1565-4ab3-bdff-8f63941a44f2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.697980 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b633428-8d76-48d9-bde6-b6233e1d7f40-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9\" (UID: \"4b633428-8d76-48d9-bde6-b6233e1d7f40\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.702802 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators_5c8cbe86-4142-478f-add6-b7d0baf83de6_0(4641253aa3ad5ade069e136273e42c1f4238ea50583df5a69977ac56b9865d33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.702918 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators_5c8cbe86-4142-478f-add6-b7d0baf83de6_0(4641253aa3ad5ade069e136273e42c1f4238ea50583df5a69977ac56b9865d33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.702947 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators_5c8cbe86-4142-478f-add6-b7d0baf83de6_0(4641253aa3ad5ade069e136273e42c1f4238ea50583df5a69977ac56b9865d33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.703008 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators(5c8cbe86-4142-478f-add6-b7d0baf83de6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators(5c8cbe86-4142-478f-add6-b7d0baf83de6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators_5c8cbe86-4142-478f-add6-b7d0baf83de6_0(4641253aa3ad5ade069e136273e42c1f4238ea50583df5a69977ac56b9865d33): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" podUID="5c8cbe86-4142-478f-add6-b7d0baf83de6" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.767500 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-5dnkd"] Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.768356 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.771715 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-bjrjl" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.773933 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.781632 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.793398 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/258d53e6-9789-4a47-8c51-e928f0ad0f6b-openshift-service-ca\") pod \"perses-operator-5446b9c989-5dnkd\" (UID: \"258d53e6-9789-4a47-8c51-e928f0ad0f6b\") " pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.793502 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e59fbf3-ac79-42b2-84c9-f2afa27c4efb-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-sks2s\" (UID: \"9e59fbf3-ac79-42b2-84c9-f2afa27c4efb\") " pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.793537 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfxxv\" (UniqueName: \"kubernetes.io/projected/258d53e6-9789-4a47-8c51-e928f0ad0f6b-kube-api-access-bfxxv\") pod \"perses-operator-5446b9c989-5dnkd\" (UID: \"258d53e6-9789-4a47-8c51-e928f0ad0f6b\") " pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.793567 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b5k4\" (UniqueName: \"kubernetes.io/projected/9e59fbf3-ac79-42b2-84c9-f2afa27c4efb-kube-api-access-4b5k4\") pod \"observability-operator-d8bb48f5d-sks2s\" (UID: \"9e59fbf3-ac79-42b2-84c9-f2afa27c4efb\") " pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.799132 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e59fbf3-ac79-42b2-84c9-f2afa27c4efb-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-sks2s\" (UID: \"9e59fbf3-ac79-42b2-84c9-f2afa27c4efb\") " pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.811583 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators_21df79d3-1565-4ab3-bdff-8f63941a44f2_0(41000ec4654eb752d198f8b2d316dc7741078d001b55ea7a839700acecec83eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.811630 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators_21df79d3-1565-4ab3-bdff-8f63941a44f2_0(41000ec4654eb752d198f8b2d316dc7741078d001b55ea7a839700acecec83eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.811649 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators_21df79d3-1565-4ab3-bdff-8f63941a44f2_0(41000ec4654eb752d198f8b2d316dc7741078d001b55ea7a839700acecec83eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.811690 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators(21df79d3-1565-4ab3-bdff-8f63941a44f2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators(21df79d3-1565-4ab3-bdff-8f63941a44f2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators_21df79d3-1565-4ab3-bdff-8f63941a44f2_0(41000ec4654eb752d198f8b2d316dc7741078d001b55ea7a839700acecec83eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" podUID="21df79d3-1565-4ab3-bdff-8f63941a44f2" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.815985 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b5k4\" (UniqueName: \"kubernetes.io/projected/9e59fbf3-ac79-42b2-84c9-f2afa27c4efb-kube-api-access-4b5k4\") pod \"observability-operator-d8bb48f5d-sks2s\" (UID: \"9e59fbf3-ac79-42b2-84c9-f2afa27c4efb\") " pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.818494 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators_4b633428-8d76-48d9-bde6-b6233e1d7f40_0(0af5619eab465bd324c32c6c26d5064ced44c67dd4190806eaf563ad4a699e77): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.818541 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators_4b633428-8d76-48d9-bde6-b6233e1d7f40_0(0af5619eab465bd324c32c6c26d5064ced44c67dd4190806eaf563ad4a699e77): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.818563 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators_4b633428-8d76-48d9-bde6-b6233e1d7f40_0(0af5619eab465bd324c32c6c26d5064ced44c67dd4190806eaf563ad4a699e77): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.818613 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators(4b633428-8d76-48d9-bde6-b6233e1d7f40)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators(4b633428-8d76-48d9-bde6-b6233e1d7f40)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators_4b633428-8d76-48d9-bde6-b6233e1d7f40_0(0af5619eab465bd324c32c6c26d5064ced44c67dd4190806eaf563ad4a699e77): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" podUID="4b633428-8d76-48d9-bde6-b6233e1d7f40" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.894311 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/258d53e6-9789-4a47-8c51-e928f0ad0f6b-openshift-service-ca\") pod \"perses-operator-5446b9c989-5dnkd\" (UID: \"258d53e6-9789-4a47-8c51-e928f0ad0f6b\") " pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.895069 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/258d53e6-9789-4a47-8c51-e928f0ad0f6b-openshift-service-ca\") pod \"perses-operator-5446b9c989-5dnkd\" (UID: \"258d53e6-9789-4a47-8c51-e928f0ad0f6b\") " pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.895147 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfxxv\" (UniqueName: \"kubernetes.io/projected/258d53e6-9789-4a47-8c51-e928f0ad0f6b-kube-api-access-bfxxv\") pod \"perses-operator-5446b9c989-5dnkd\" (UID: \"258d53e6-9789-4a47-8c51-e928f0ad0f6b\") " pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.915296 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfxxv\" (UniqueName: \"kubernetes.io/projected/258d53e6-9789-4a47-8c51-e928f0ad0f6b-kube-api-access-bfxxv\") pod \"perses-operator-5446b9c989-5dnkd\" (UID: \"258d53e6-9789-4a47-8c51-e928f0ad0f6b\") " pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:31 crc kubenswrapper[4907]: I1129 14:40:31.953142 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.973455 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-sks2s_openshift-operators_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb_0(ba4aba181afffd8f878b3d199e0b5fc0d203719147a7f47ac9edca3f968b7473): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.973528 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-sks2s_openshift-operators_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb_0(ba4aba181afffd8f878b3d199e0b5fc0d203719147a7f47ac9edca3f968b7473): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.973555 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-sks2s_openshift-operators_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb_0(ba4aba181afffd8f878b3d199e0b5fc0d203719147a7f47ac9edca3f968b7473): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:31 crc kubenswrapper[4907]: E1129 14:40:31.973607 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-sks2s_openshift-operators(9e59fbf3-ac79-42b2-84c9-f2afa27c4efb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-sks2s_openshift-operators(9e59fbf3-ac79-42b2-84c9-f2afa27c4efb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-sks2s_openshift-operators_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb_0(ba4aba181afffd8f878b3d199e0b5fc0d203719147a7f47ac9edca3f968b7473): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" podUID="9e59fbf3-ac79-42b2-84c9-f2afa27c4efb" Nov 29 14:40:32 crc kubenswrapper[4907]: I1129 14:40:32.083167 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:32 crc kubenswrapper[4907]: E1129 14:40:32.105855 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-5dnkd_openshift-operators_258d53e6-9789-4a47-8c51-e928f0ad0f6b_0(a08196997e4bafc238d4e4a9ea69d219571d87e5f7dc0db2a109a542a57c517e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 29 14:40:32 crc kubenswrapper[4907]: E1129 14:40:32.105921 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-5dnkd_openshift-operators_258d53e6-9789-4a47-8c51-e928f0ad0f6b_0(a08196997e4bafc238d4e4a9ea69d219571d87e5f7dc0db2a109a542a57c517e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:32 crc kubenswrapper[4907]: E1129 14:40:32.105941 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-5dnkd_openshift-operators_258d53e6-9789-4a47-8c51-e928f0ad0f6b_0(a08196997e4bafc238d4e4a9ea69d219571d87e5f7dc0db2a109a542a57c517e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:32 crc kubenswrapper[4907]: E1129 14:40:32.105987 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-5dnkd_openshift-operators(258d53e6-9789-4a47-8c51-e928f0ad0f6b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-5dnkd_openshift-operators(258d53e6-9789-4a47-8c51-e928f0ad0f6b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-5dnkd_openshift-operators_258d53e6-9789-4a47-8c51-e928f0ad0f6b_0(a08196997e4bafc238d4e4a9ea69d219571d87e5f7dc0db2a109a542a57c517e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" podUID="258d53e6-9789-4a47-8c51-e928f0ad0f6b" Nov 29 14:40:32 crc kubenswrapper[4907]: I1129 14:40:32.298497 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerStarted","Data":"95d71b6cbac161ac2671079d62a8ce9526edd7e5ae7ba2e6608a1b4e87f9ddbe"} Nov 29 14:40:34 crc kubenswrapper[4907]: I1129 14:40:34.311884 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerStarted","Data":"f6b017e73b514efab9a8618e7c6ec25eb1fb6dec7e37a832cc30ca933078ab14"} Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.337416 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" event={"ID":"ece93209-da41-4af9-b370-37edb642194e","Type":"ContainerStarted","Data":"ba3ca9c5a7a81c5d2884e81f1129bf45c7981d295ea9ae91eab694a725202ed8"} Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.337903 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.337914 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.374666 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.425102 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" podStartSLOduration=10.425087449 podStartE2EDuration="10.425087449s" podCreationTimestamp="2025-11-29 14:40:27 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:40:37.389490036 +0000 UTC m=+735.376327688" watchObservedRunningTime="2025-11-29 14:40:37.425087449 +0000 UTC m=+735.411925101" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.562496 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9"] Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.562823 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.563385 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.570485 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8"] Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.570778 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.571292 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.577428 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc"] Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.577557 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.577895 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.581028 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-sks2s"] Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.581188 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.581620 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.602384 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-5dnkd"] Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.602518 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:37 crc kubenswrapper[4907]: I1129 14:40:37.603019 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.606904 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators_4b633428-8d76-48d9-bde6-b6233e1d7f40_0(d652a3a8b0ff1412d8231b9b78c2a32b98d1da1b90e4ad38bf00c578074dbd05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.606966 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators_4b633428-8d76-48d9-bde6-b6233e1d7f40_0(d652a3a8b0ff1412d8231b9b78c2a32b98d1da1b90e4ad38bf00c578074dbd05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.606992 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators_4b633428-8d76-48d9-bde6-b6233e1d7f40_0(d652a3a8b0ff1412d8231b9b78c2a32b98d1da1b90e4ad38bf00c578074dbd05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.607034 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators(4b633428-8d76-48d9-bde6-b6233e1d7f40)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators(4b633428-8d76-48d9-bde6-b6233e1d7f40)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_openshift-operators_4b633428-8d76-48d9-bde6-b6233e1d7f40_0(d652a3a8b0ff1412d8231b9b78c2a32b98d1da1b90e4ad38bf00c578074dbd05): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" podUID="4b633428-8d76-48d9-bde6-b6233e1d7f40" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.607374 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators_5c8cbe86-4142-478f-add6-b7d0baf83de6_0(743f399a7f6ca8dd2c7fe69ade26fc53b3f681a4b4a5c2827fca67436bc1e68f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.607466 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators_5c8cbe86-4142-478f-add6-b7d0baf83de6_0(743f399a7f6ca8dd2c7fe69ade26fc53b3f681a4b4a5c2827fca67436bc1e68f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.607495 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators_5c8cbe86-4142-478f-add6-b7d0baf83de6_0(743f399a7f6ca8dd2c7fe69ade26fc53b3f681a4b4a5c2827fca67436bc1e68f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.607559 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators(5c8cbe86-4142-478f-add6-b7d0baf83de6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators(5c8cbe86-4142-478f-add6-b7d0baf83de6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-jt7c8_openshift-operators_5c8cbe86-4142-478f-add6-b7d0baf83de6_0(743f399a7f6ca8dd2c7fe69ade26fc53b3f681a4b4a5c2827fca67436bc1e68f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" podUID="5c8cbe86-4142-478f-add6-b7d0baf83de6" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.643532 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-sks2s_openshift-operators_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb_0(57cb48a2c49f053a1f441a38187c7dee866dc080cf7480fe3b6e8534c3feb632): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.643849 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-sks2s_openshift-operators_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb_0(57cb48a2c49f053a1f441a38187c7dee866dc080cf7480fe3b6e8534c3feb632): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.643874 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-sks2s_openshift-operators_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb_0(57cb48a2c49f053a1f441a38187c7dee866dc080cf7480fe3b6e8534c3feb632): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.643922 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-sks2s_openshift-operators(9e59fbf3-ac79-42b2-84c9-f2afa27c4efb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-sks2s_openshift-operators(9e59fbf3-ac79-42b2-84c9-f2afa27c4efb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-sks2s_openshift-operators_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb_0(57cb48a2c49f053a1f441a38187c7dee866dc080cf7480fe3b6e8534c3feb632): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" podUID="9e59fbf3-ac79-42b2-84c9-f2afa27c4efb" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.660773 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators_21df79d3-1565-4ab3-bdff-8f63941a44f2_0(eada95444fc1f534bd198280c7965566d51aedc1e4614879ecb512f35c890ed1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.660842 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators_21df79d3-1565-4ab3-bdff-8f63941a44f2_0(eada95444fc1f534bd198280c7965566d51aedc1e4614879ecb512f35c890ed1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.660866 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators_21df79d3-1565-4ab3-bdff-8f63941a44f2_0(eada95444fc1f534bd198280c7965566d51aedc1e4614879ecb512f35c890ed1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.660909 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators(21df79d3-1565-4ab3-bdff-8f63941a44f2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators(21df79d3-1565-4ab3-bdff-8f63941a44f2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_openshift-operators_21df79d3-1565-4ab3-bdff-8f63941a44f2_0(eada95444fc1f534bd198280c7965566d51aedc1e4614879ecb512f35c890ed1): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" podUID="21df79d3-1565-4ab3-bdff-8f63941a44f2" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.672378 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-5dnkd_openshift-operators_258d53e6-9789-4a47-8c51-e928f0ad0f6b_0(86037922605d7c113b83cd6e9e588fd493055bdce4d199b3d209800627daffc9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.672431 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-5dnkd_openshift-operators_258d53e6-9789-4a47-8c51-e928f0ad0f6b_0(86037922605d7c113b83cd6e9e588fd493055bdce4d199b3d209800627daffc9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.672465 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-5dnkd_openshift-operators_258d53e6-9789-4a47-8c51-e928f0ad0f6b_0(86037922605d7c113b83cd6e9e588fd493055bdce4d199b3d209800627daffc9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:37 crc kubenswrapper[4907]: E1129 14:40:37.672502 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-5dnkd_openshift-operators(258d53e6-9789-4a47-8c51-e928f0ad0f6b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-5dnkd_openshift-operators(258d53e6-9789-4a47-8c51-e928f0ad0f6b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-5dnkd_openshift-operators_258d53e6-9789-4a47-8c51-e928f0ad0f6b_0(86037922605d7c113b83cd6e9e588fd493055bdce4d199b3d209800627daffc9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" podUID="258d53e6-9789-4a47-8c51-e928f0ad0f6b" Nov 29 14:40:38 crc kubenswrapper[4907]: I1129 14:40:38.162832 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:38 crc kubenswrapper[4907]: I1129 14:40:38.235866 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:41 crc kubenswrapper[4907]: I1129 14:40:41.479674 4907 scope.go:117] "RemoveContainer" containerID="758c2a8240a7ddc01c0eefe154215e74709991c70756567f3ce4c50d9d63ef7f" Nov 29 14:40:43 crc kubenswrapper[4907]: I1129 14:40:43.411897 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5zvb_3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4/kube-multus/2.log" Nov 29 14:40:43 crc kubenswrapper[4907]: I1129 14:40:43.412741 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5zvb" event={"ID":"3bc30bb0-1a1a-48df-af8a-c023bfbfa3f4","Type":"ContainerStarted","Data":"0d61f3b47929a0f398900a635a7fb3c1a5d3081ae84ff5a8948dc1038d4c76ed"} Nov 29 14:40:48 crc 
kubenswrapper[4907]: I1129 14:40:48.479128 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:48 crc kubenswrapper[4907]: I1129 14:40:48.480230 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:40:48 crc kubenswrapper[4907]: I1129 14:40:48.961162 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-5dnkd"] Nov 29 14:40:49 crc kubenswrapper[4907]: I1129 14:40:49.463036 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" event={"ID":"258d53e6-9789-4a47-8c51-e928f0ad0f6b","Type":"ContainerStarted","Data":"d49e237b35852bfd3d7ccd8a53c4cbed2e07f6261d2a4a678ccd9528d99b3165"} Nov 29 14:40:51 crc kubenswrapper[4907]: I1129 14:40:51.478739 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:51 crc kubenswrapper[4907]: I1129 14:40:51.479507 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:40:51 crc kubenswrapper[4907]: I1129 14:40:51.694046 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-sks2s"] Nov 29 14:40:52 crc kubenswrapper[4907]: I1129 14:40:52.478576 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:52 crc kubenswrapper[4907]: I1129 14:40:52.478664 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:52 crc kubenswrapper[4907]: I1129 14:40:52.483212 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" Nov 29 14:40:52 crc kubenswrapper[4907]: I1129 14:40:52.483367 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" Nov 29 14:40:52 crc kubenswrapper[4907]: I1129 14:40:52.491346 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" event={"ID":"9e59fbf3-ac79-42b2-84c9-f2afa27c4efb","Type":"ContainerStarted","Data":"6e99c6f488f59c370827d93b51c501e3093391700f3db1c64d93b1a87872a941"} Nov 29 14:40:52 crc kubenswrapper[4907]: I1129 14:40:52.748629 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8"] Nov 29 14:40:52 crc kubenswrapper[4907]: I1129 14:40:52.826712 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9"] Nov 29 14:40:53 crc kubenswrapper[4907]: I1129 14:40:53.479426 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:53 crc kubenswrapper[4907]: I1129 14:40:53.480079 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" Nov 29 14:40:53 crc kubenswrapper[4907]: I1129 14:40:53.502404 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" event={"ID":"4b633428-8d76-48d9-bde6-b6233e1d7f40","Type":"ContainerStarted","Data":"964430483f4d1aa5a9924050c9cdc0785e1667ac65d1c9b45017b8e67302c3c7"} Nov 29 14:40:53 crc kubenswrapper[4907]: I1129 14:40:53.504802 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" event={"ID":"5c8cbe86-4142-478f-add6-b7d0baf83de6","Type":"ContainerStarted","Data":"6507ae0ab6978c6044ec063a027ce48eb1e52f4c79313360979caa486ab8eb18"} Nov 29 14:40:53 crc kubenswrapper[4907]: I1129 14:40:53.953003 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc"] Nov 29 14:40:57 crc kubenswrapper[4907]: I1129 14:40:57.545787 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" event={"ID":"21df79d3-1565-4ab3-bdff-8f63941a44f2","Type":"ContainerStarted","Data":"b753277813707704580c499c046fa61f8e849ca0c084be797480e011dc853bf0"} Nov 29 14:40:58 crc kubenswrapper[4907]: I1129 14:40:58.222118 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-khkjj" Nov 29 14:40:58 crc kubenswrapper[4907]: I1129 14:40:58.490270 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:40:58 crc kubenswrapper[4907]: I1129 14:40:58.490334 4907 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:41:02 crc kubenswrapper[4907]: I1129 14:41:02.473900 4907 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.617651 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" event={"ID":"4b633428-8d76-48d9-bde6-b6233e1d7f40","Type":"ContainerStarted","Data":"45fefd2308e0b18a6a589d84ab4fdee3754fada8ad271bfcd68740a71448c356"} Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.620196 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" event={"ID":"258d53e6-9789-4a47-8c51-e928f0ad0f6b","Type":"ContainerStarted","Data":"030718ec6764a77b0ff5d847dbcda0b86e53141c3a14e17e53a77ce3f192f02f"} Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.621600 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.623497 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" event={"ID":"21df79d3-1565-4ab3-bdff-8f63941a44f2","Type":"ContainerStarted","Data":"32b4327c127eb1393f42f55498c9c1d50574e8c5b9e9b42fd5fb195956d9435c"} Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.625996 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" 
event={"ID":"5c8cbe86-4142-478f-add6-b7d0baf83de6","Type":"ContainerStarted","Data":"1efd29b508f2a4fcbe0daed8b9aadedb6ca0eec8609cfbca9fba40e0768c1902"} Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.628323 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" event={"ID":"9e59fbf3-ac79-42b2-84c9-f2afa27c4efb","Type":"ContainerStarted","Data":"965e515423b96f8eb846d38ebd15122982437b46431da78ec93fdde80ab072f6"} Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.629579 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.639433 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.662787 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9" podStartSLOduration=22.921289952 podStartE2EDuration="33.662758846s" podCreationTimestamp="2025-11-29 14:40:31 +0000 UTC" firstStartedPulling="2025-11-29 14:40:52.845780993 +0000 UTC m=+750.832618645" lastFinishedPulling="2025-11-29 14:41:03.587249887 +0000 UTC m=+761.574087539" observedRunningTime="2025-11-29 14:41:04.648430383 +0000 UTC m=+762.635268135" watchObservedRunningTime="2025-11-29 14:41:04.662758846 +0000 UTC m=+762.649596508" Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.702135 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc" podStartSLOduration=27.58807033 podStartE2EDuration="33.702107026s" podCreationTimestamp="2025-11-29 14:40:31 +0000 UTC" firstStartedPulling="2025-11-29 14:40:57.494222364 +0000 UTC m=+755.481060016" 
lastFinishedPulling="2025-11-29 14:41:03.60825906 +0000 UTC m=+761.595096712" observedRunningTime="2025-11-29 14:41:04.699225585 +0000 UTC m=+762.686063287" watchObservedRunningTime="2025-11-29 14:41:04.702107026 +0000 UTC m=+762.688944698" Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.732221 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jt7c8" podStartSLOduration=22.913599455 podStartE2EDuration="33.732196694s" podCreationTimestamp="2025-11-29 14:40:31 +0000 UTC" firstStartedPulling="2025-11-29 14:40:52.768670129 +0000 UTC m=+750.755507781" lastFinishedPulling="2025-11-29 14:41:03.587267368 +0000 UTC m=+761.574105020" observedRunningTime="2025-11-29 14:41:04.724074285 +0000 UTC m=+762.710911967" watchObservedRunningTime="2025-11-29 14:41:04.732196694 +0000 UTC m=+762.719034356" Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.753421 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-sks2s" podStartSLOduration=21.935103881 podStartE2EDuration="33.753388491s" podCreationTimestamp="2025-11-29 14:40:31 +0000 UTC" firstStartedPulling="2025-11-29 14:40:51.76942357 +0000 UTC m=+749.756261222" lastFinishedPulling="2025-11-29 14:41:03.58770815 +0000 UTC m=+761.574545832" observedRunningTime="2025-11-29 14:41:04.748958777 +0000 UTC m=+762.735796439" watchObservedRunningTime="2025-11-29 14:41:04.753388491 +0000 UTC m=+762.740226143" Nov 29 14:41:04 crc kubenswrapper[4907]: I1129 14:41:04.779555 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" podStartSLOduration=19.24231999 podStartE2EDuration="33.779532008s" podCreationTimestamp="2025-11-29 14:40:31 +0000 UTC" firstStartedPulling="2025-11-29 14:40:48.972965967 +0000 UTC m=+746.959803649" lastFinishedPulling="2025-11-29 14:41:03.510177975 +0000 UTC 
m=+761.497015667" observedRunningTime="2025-11-29 14:41:04.773346394 +0000 UTC m=+762.760184046" watchObservedRunningTime="2025-11-29 14:41:04.779532008 +0000 UTC m=+762.766369660" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.655132 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ps9pn"] Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.657127 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-ps9pn" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.659334 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.659380 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.659582 4907 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-tt4tp" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.663664 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-j4z4g"] Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.664853 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-j4z4g" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.667996 4907 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-5zcj6" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.674043 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-79pdn"] Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.675157 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.679063 4907 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-klmt6" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.682983 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ps9pn"] Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.685644 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-j4z4g"] Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.715460 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-79pdn"] Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.854540 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vbg5\" (UniqueName: \"kubernetes.io/projected/b2590975-d739-4401-ae0d-8ef8dd6ba179-kube-api-access-4vbg5\") pod \"cert-manager-5b446d88c5-j4z4g\" (UID: \"b2590975-d739-4401-ae0d-8ef8dd6ba179\") " pod="cert-manager/cert-manager-5b446d88c5-j4z4g" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.854612 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwf2c\" (UniqueName: \"kubernetes.io/projected/a1af2f3d-c619-4288-93c6-721cb89dc1cf-kube-api-access-qwf2c\") pod \"cert-manager-cainjector-7f985d654d-ps9pn\" (UID: \"a1af2f3d-c619-4288-93c6-721cb89dc1cf\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ps9pn" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.854804 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqlrs\" (UniqueName: \"kubernetes.io/projected/11b5343c-652a-4d02-841e-2373f5b9f0cf-kube-api-access-pqlrs\") pod 
\"cert-manager-webhook-5655c58dd6-79pdn\" (UID: \"11b5343c-652a-4d02-841e-2373f5b9f0cf\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.956800 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqlrs\" (UniqueName: \"kubernetes.io/projected/11b5343c-652a-4d02-841e-2373f5b9f0cf-kube-api-access-pqlrs\") pod \"cert-manager-webhook-5655c58dd6-79pdn\" (UID: \"11b5343c-652a-4d02-841e-2373f5b9f0cf\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.956901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vbg5\" (UniqueName: \"kubernetes.io/projected/b2590975-d739-4401-ae0d-8ef8dd6ba179-kube-api-access-4vbg5\") pod \"cert-manager-5b446d88c5-j4z4g\" (UID: \"b2590975-d739-4401-ae0d-8ef8dd6ba179\") " pod="cert-manager/cert-manager-5b446d88c5-j4z4g" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.956964 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwf2c\" (UniqueName: \"kubernetes.io/projected/a1af2f3d-c619-4288-93c6-721cb89dc1cf-kube-api-access-qwf2c\") pod \"cert-manager-cainjector-7f985d654d-ps9pn\" (UID: \"a1af2f3d-c619-4288-93c6-721cb89dc1cf\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ps9pn" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.981204 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwf2c\" (UniqueName: \"kubernetes.io/projected/a1af2f3d-c619-4288-93c6-721cb89dc1cf-kube-api-access-qwf2c\") pod \"cert-manager-cainjector-7f985d654d-ps9pn\" (UID: \"a1af2f3d-c619-4288-93c6-721cb89dc1cf\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ps9pn" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.983237 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vbg5\" 
(UniqueName: \"kubernetes.io/projected/b2590975-d739-4401-ae0d-8ef8dd6ba179-kube-api-access-4vbg5\") pod \"cert-manager-5b446d88c5-j4z4g\" (UID: \"b2590975-d739-4401-ae0d-8ef8dd6ba179\") " pod="cert-manager/cert-manager-5b446d88c5-j4z4g" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.989822 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqlrs\" (UniqueName: \"kubernetes.io/projected/11b5343c-652a-4d02-841e-2373f5b9f0cf-kube-api-access-pqlrs\") pod \"cert-manager-webhook-5655c58dd6-79pdn\" (UID: \"11b5343c-652a-4d02-841e-2373f5b9f0cf\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" Nov 29 14:41:10 crc kubenswrapper[4907]: I1129 14:41:10.991938 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-ps9pn" Nov 29 14:41:11 crc kubenswrapper[4907]: I1129 14:41:11.005815 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-j4z4g" Nov 29 14:41:11 crc kubenswrapper[4907]: I1129 14:41:11.021853 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" Nov 29 14:41:11 crc kubenswrapper[4907]: I1129 14:41:11.525685 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ps9pn"] Nov 29 14:41:11 crc kubenswrapper[4907]: I1129 14:41:11.570477 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-79pdn"] Nov 29 14:41:11 crc kubenswrapper[4907]: I1129 14:41:11.578332 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-j4z4g"] Nov 29 14:41:11 crc kubenswrapper[4907]: W1129 14:41:11.580631 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b5343c_652a_4d02_841e_2373f5b9f0cf.slice/crio-b2cf95f6527969f60514fa9c51b9cc9660e6e58d3ff9b4c774a0f84103f9b4fd WatchSource:0}: Error finding container b2cf95f6527969f60514fa9c51b9cc9660e6e58d3ff9b4c774a0f84103f9b4fd: Status 404 returned error can't find the container with id b2cf95f6527969f60514fa9c51b9cc9660e6e58d3ff9b4c774a0f84103f9b4fd Nov 29 14:41:11 crc kubenswrapper[4907]: W1129 14:41:11.588081 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2590975_d739_4401_ae0d_8ef8dd6ba179.slice/crio-3c5f2fafda7673a5a5ac18d523c811d42d7c1ea83631d376bb8ccbaa352002fa WatchSource:0}: Error finding container 3c5f2fafda7673a5a5ac18d523c811d42d7c1ea83631d376bb8ccbaa352002fa: Status 404 returned error can't find the container with id 3c5f2fafda7673a5a5ac18d523c811d42d7c1ea83631d376bb8ccbaa352002fa Nov 29 14:41:11 crc kubenswrapper[4907]: I1129 14:41:11.673052 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-j4z4g" 
event={"ID":"b2590975-d739-4401-ae0d-8ef8dd6ba179","Type":"ContainerStarted","Data":"3c5f2fafda7673a5a5ac18d523c811d42d7c1ea83631d376bb8ccbaa352002fa"} Nov 29 14:41:11 crc kubenswrapper[4907]: I1129 14:41:11.674719 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" event={"ID":"11b5343c-652a-4d02-841e-2373f5b9f0cf","Type":"ContainerStarted","Data":"b2cf95f6527969f60514fa9c51b9cc9660e6e58d3ff9b4c774a0f84103f9b4fd"} Nov 29 14:41:11 crc kubenswrapper[4907]: I1129 14:41:11.675758 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-ps9pn" event={"ID":"a1af2f3d-c619-4288-93c6-721cb89dc1cf","Type":"ContainerStarted","Data":"5291cbcfbb3e010b4127e62edf14de91ebc108c63b503065e088f2004a515c45"} Nov 29 14:41:12 crc kubenswrapper[4907]: I1129 14:41:12.087637 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-5dnkd" Nov 29 14:41:16 crc kubenswrapper[4907]: I1129 14:41:16.705815 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-ps9pn" event={"ID":"a1af2f3d-c619-4288-93c6-721cb89dc1cf","Type":"ContainerStarted","Data":"9641f735a1061def18c23b99559d6d7bd6b22c7a7425d4db5ebc230df29dd18f"} Nov 29 14:41:16 crc kubenswrapper[4907]: I1129 14:41:16.708577 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-j4z4g" event={"ID":"b2590975-d739-4401-ae0d-8ef8dd6ba179","Type":"ContainerStarted","Data":"abde0f18c5b322b96512d42fe777729c33c75eb48da624b087120efed5f80d02"} Nov 29 14:41:16 crc kubenswrapper[4907]: I1129 14:41:16.709815 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" event={"ID":"11b5343c-652a-4d02-841e-2373f5b9f0cf","Type":"ContainerStarted","Data":"6ac153c6538bd37909d35f4c4316a82ab4be40fb3249af4a4b53a0612ae17e43"} Nov 29 14:41:16 
crc kubenswrapper[4907]: I1129 14:41:16.709922 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" Nov 29 14:41:16 crc kubenswrapper[4907]: I1129 14:41:16.728614 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-ps9pn" podStartSLOduration=2.034490748 podStartE2EDuration="6.728598494s" podCreationTimestamp="2025-11-29 14:41:10 +0000 UTC" firstStartedPulling="2025-11-29 14:41:11.536145159 +0000 UTC m=+769.522982851" lastFinishedPulling="2025-11-29 14:41:16.230252945 +0000 UTC m=+774.217090597" observedRunningTime="2025-11-29 14:41:16.725391864 +0000 UTC m=+774.712229526" watchObservedRunningTime="2025-11-29 14:41:16.728598494 +0000 UTC m=+774.715436146" Nov 29 14:41:16 crc kubenswrapper[4907]: I1129 14:41:16.753557 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" podStartSLOduration=2.098740379 podStartE2EDuration="6.753538507s" podCreationTimestamp="2025-11-29 14:41:10 +0000 UTC" firstStartedPulling="2025-11-29 14:41:11.58299762 +0000 UTC m=+769.569835272" lastFinishedPulling="2025-11-29 14:41:16.237795748 +0000 UTC m=+774.224633400" observedRunningTime="2025-11-29 14:41:16.748006431 +0000 UTC m=+774.734844083" watchObservedRunningTime="2025-11-29 14:41:16.753538507 +0000 UTC m=+774.740376159" Nov 29 14:41:16 crc kubenswrapper[4907]: I1129 14:41:16.788522 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-j4z4g" podStartSLOduration=2.128983971 podStartE2EDuration="6.788489002s" podCreationTimestamp="2025-11-29 14:41:10 +0000 UTC" firstStartedPulling="2025-11-29 14:41:11.590184752 +0000 UTC m=+769.577022394" lastFinishedPulling="2025-11-29 14:41:16.249689773 +0000 UTC m=+774.236527425" observedRunningTime="2025-11-29 14:41:16.781731692 +0000 UTC m=+774.768569364" 
watchObservedRunningTime="2025-11-29 14:41:16.788489002 +0000 UTC m=+774.775326654" Nov 29 14:41:21 crc kubenswrapper[4907]: I1129 14:41:21.028564 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-79pdn" Nov 29 14:41:28 crc kubenswrapper[4907]: I1129 14:41:28.490921 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:41:28 crc kubenswrapper[4907]: I1129 14:41:28.491671 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.694690 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g"] Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.696592 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.698663 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.720697 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.720928 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq2gw\" (UniqueName: \"kubernetes.io/projected/665b9a04-0d24-45c5-9129-8d37342f2674-kube-api-access-kq2gw\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.721097 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.821640 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq2gw\" (UniqueName: 
\"kubernetes.io/projected/665b9a04-0d24-45c5-9129-8d37342f2674-kube-api-access-kq2gw\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.821708 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.821747 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.822125 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.822218 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g\" (UID: 
\"665b9a04-0d24-45c5-9129-8d37342f2674\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.843944 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq2gw\" (UniqueName: \"kubernetes.io/projected/665b9a04-0d24-45c5-9129-8d37342f2674-kube-api-access-kq2gw\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:47 crc kubenswrapper[4907]: I1129 14:41:47.901294 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g"] Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.016500 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.055294 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p"] Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.057328 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.071583 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p"] Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.228401 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.228944 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfklr\" (UniqueName: \"kubernetes.io/projected/d1016dfd-9651-4c1f-94f4-312c4eab6a00-kube-api-access-pfklr\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.228972 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.329626 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.329745 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfklr\" (UniqueName: \"kubernetes.io/projected/d1016dfd-9651-4c1f-94f4-312c4eab6a00-kube-api-access-pfklr\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.329773 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.330210 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.330316 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p\" (UID: 
\"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.349856 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfklr\" (UniqueName: \"kubernetes.io/projected/d1016dfd-9651-4c1f-94f4-312c4eab6a00-kube-api-access-pfklr\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.399927 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.500369 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g"] Nov 29 14:41:48 crc kubenswrapper[4907]: I1129 14:41:48.890976 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p"] Nov 29 14:41:48 crc kubenswrapper[4907]: W1129 14:41:48.897798 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1016dfd_9651_4c1f_94f4_312c4eab6a00.slice/crio-f2e9a1d925ecc6f59c2dedaf4faa2307b3fa79f8008fbf5b8f6d7b6fa3a9c35c WatchSource:0}: Error finding container f2e9a1d925ecc6f59c2dedaf4faa2307b3fa79f8008fbf5b8f6d7b6fa3a9c35c: Status 404 returned error can't find the container with id f2e9a1d925ecc6f59c2dedaf4faa2307b3fa79f8008fbf5b8f6d7b6fa3a9c35c Nov 29 14:41:49 crc kubenswrapper[4907]: I1129 14:41:49.003387 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" event={"ID":"665b9a04-0d24-45c5-9129-8d37342f2674","Type":"ContainerStarted","Data":"8cc952256635c549709e6af54c11d960db6cdab1f6116d8f3c84422829254970"} Nov 29 14:41:49 crc kubenswrapper[4907]: I1129 14:41:49.004869 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" event={"ID":"d1016dfd-9651-4c1f-94f4-312c4eab6a00","Type":"ContainerStarted","Data":"f2e9a1d925ecc6f59c2dedaf4faa2307b3fa79f8008fbf5b8f6d7b6fa3a9c35c"} Nov 29 14:41:50 crc kubenswrapper[4907]: I1129 14:41:50.015153 4907 generic.go:334] "Generic (PLEG): container finished" podID="665b9a04-0d24-45c5-9129-8d37342f2674" containerID="9bea0e8ab625237715d5c0f21fdbacfa6406638e2a8abd8ef851c6bc18235e25" exitCode=0 Nov 29 14:41:50 crc kubenswrapper[4907]: I1129 14:41:50.015220 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" event={"ID":"665b9a04-0d24-45c5-9129-8d37342f2674","Type":"ContainerDied","Data":"9bea0e8ab625237715d5c0f21fdbacfa6406638e2a8abd8ef851c6bc18235e25"} Nov 29 14:41:50 crc kubenswrapper[4907]: I1129 14:41:50.017221 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerID="2310a771b2737fee708b1047b63eb53aa9fee71da20d9c4950ceeee5b1c05eb3" exitCode=0 Nov 29 14:41:50 crc kubenswrapper[4907]: I1129 14:41:50.017257 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" event={"ID":"d1016dfd-9651-4c1f-94f4-312c4eab6a00","Type":"ContainerDied","Data":"2310a771b2737fee708b1047b63eb53aa9fee71da20d9c4950ceeee5b1c05eb3"} Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.429906 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-f5g84"] Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.432672 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.450672 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5g84"] Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.525386 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-catalog-content\") pod \"redhat-operators-f5g84\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") " pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.525627 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q75dw\" (UniqueName: \"kubernetes.io/projected/c67b4696-5fc8-4648-b126-6c184880c981-kube-api-access-q75dw\") pod \"redhat-operators-f5g84\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") " pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.525688 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-utilities\") pod \"redhat-operators-f5g84\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") " pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.627437 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-catalog-content\") pod \"redhat-operators-f5g84\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") " 
pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.627535 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q75dw\" (UniqueName: \"kubernetes.io/projected/c67b4696-5fc8-4648-b126-6c184880c981-kube-api-access-q75dw\") pod \"redhat-operators-f5g84\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") " pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.627582 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-utilities\") pod \"redhat-operators-f5g84\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") " pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.628527 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-utilities\") pod \"redhat-operators-f5g84\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") " pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.628542 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-catalog-content\") pod \"redhat-operators-f5g84\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") " pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 crc kubenswrapper[4907]: I1129 14:41:51.662786 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q75dw\" (UniqueName: \"kubernetes.io/projected/c67b4696-5fc8-4648-b126-6c184880c981-kube-api-access-q75dw\") pod \"redhat-operators-f5g84\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") " pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:51 
crc kubenswrapper[4907]: I1129 14:41:51.761137 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5g84" Nov 29 14:41:52 crc kubenswrapper[4907]: I1129 14:41:52.044048 4907 generic.go:334] "Generic (PLEG): container finished" podID="665b9a04-0d24-45c5-9129-8d37342f2674" containerID="bcc4718c3a5936967c03c8fce261a06d0f0242dd42fc19160f9e85e16bb29a05" exitCode=0 Nov 29 14:41:52 crc kubenswrapper[4907]: I1129 14:41:52.044130 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" event={"ID":"665b9a04-0d24-45c5-9129-8d37342f2674","Type":"ContainerDied","Data":"bcc4718c3a5936967c03c8fce261a06d0f0242dd42fc19160f9e85e16bb29a05"} Nov 29 14:41:52 crc kubenswrapper[4907]: I1129 14:41:52.089278 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5g84"] Nov 29 14:41:53 crc kubenswrapper[4907]: I1129 14:41:53.070217 4907 generic.go:334] "Generic (PLEG): container finished" podID="665b9a04-0d24-45c5-9129-8d37342f2674" containerID="116f8caee9b3c45a3598d4b974415c92ea5864157cdae42311edaf6e92c43a54" exitCode=0 Nov 29 14:41:53 crc kubenswrapper[4907]: I1129 14:41:53.070553 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" event={"ID":"665b9a04-0d24-45c5-9129-8d37342f2674","Type":"ContainerDied","Data":"116f8caee9b3c45a3598d4b974415c92ea5864157cdae42311edaf6e92c43a54"} Nov 29 14:41:53 crc kubenswrapper[4907]: I1129 14:41:53.073252 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerID="4de0e5d5fad523add0fc5ac6397d84e261edbac52dc004caed9a307ceb044ea8" exitCode=0 Nov 29 14:41:53 crc kubenswrapper[4907]: I1129 14:41:53.073345 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" event={"ID":"d1016dfd-9651-4c1f-94f4-312c4eab6a00","Type":"ContainerDied","Data":"4de0e5d5fad523add0fc5ac6397d84e261edbac52dc004caed9a307ceb044ea8"}
Nov 29 14:41:53 crc kubenswrapper[4907]: I1129 14:41:53.075346 4907 generic.go:334] "Generic (PLEG): container finished" podID="c67b4696-5fc8-4648-b126-6c184880c981" containerID="5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01" exitCode=0
Nov 29 14:41:53 crc kubenswrapper[4907]: I1129 14:41:53.075409 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5g84" event={"ID":"c67b4696-5fc8-4648-b126-6c184880c981","Type":"ContainerDied","Data":"5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01"}
Nov 29 14:41:53 crc kubenswrapper[4907]: I1129 14:41:53.075478 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5g84" event={"ID":"c67b4696-5fc8-4648-b126-6c184880c981","Type":"ContainerStarted","Data":"3e43fb8207845aca5bf19b2cf20f1b5a84837f93161a597f0634b07e916b7a37"}
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.089676 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerID="8ea273844439865d723a50c8741d35e0c2f90de0e399cabe49ce139b8bfcfb97" exitCode=0
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.089747 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" event={"ID":"d1016dfd-9651-4c1f-94f4-312c4eab6a00","Type":"ContainerDied","Data":"8ea273844439865d723a50c8741d35e0c2f90de0e399cabe49ce139b8bfcfb97"}
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.398177 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g"
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.586387 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq2gw\" (UniqueName: \"kubernetes.io/projected/665b9a04-0d24-45c5-9129-8d37342f2674-kube-api-access-kq2gw\") pod \"665b9a04-0d24-45c5-9129-8d37342f2674\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") "
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.586529 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-util\") pod \"665b9a04-0d24-45c5-9129-8d37342f2674\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") "
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.595062 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-bundle\") pod \"665b9a04-0d24-45c5-9129-8d37342f2674\" (UID: \"665b9a04-0d24-45c5-9129-8d37342f2674\") "
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.597435 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-bundle" (OuterVolumeSpecName: "bundle") pod "665b9a04-0d24-45c5-9129-8d37342f2674" (UID: "665b9a04-0d24-45c5-9129-8d37342f2674"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.598773 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/665b9a04-0d24-45c5-9129-8d37342f2674-kube-api-access-kq2gw" (OuterVolumeSpecName: "kube-api-access-kq2gw") pod "665b9a04-0d24-45c5-9129-8d37342f2674" (UID: "665b9a04-0d24-45c5-9129-8d37342f2674"). InnerVolumeSpecName "kube-api-access-kq2gw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.697608 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:41:54 crc kubenswrapper[4907]: I1129 14:41:54.697663 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq2gw\" (UniqueName: \"kubernetes.io/projected/665b9a04-0d24-45c5-9129-8d37342f2674-kube-api-access-kq2gw\") on node \"crc\" DevicePath \"\""
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.096571 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5g84" event={"ID":"c67b4696-5fc8-4648-b126-6c184880c981","Type":"ContainerStarted","Data":"80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc"}
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.099100 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g" event={"ID":"665b9a04-0d24-45c5-9129-8d37342f2674","Type":"ContainerDied","Data":"8cc952256635c549709e6af54c11d960db6cdab1f6116d8f3c84422829254970"}
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.099132 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g"
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.099147 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cc952256635c549709e6af54c11d960db6cdab1f6116d8f3c84422829254970"
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.339039 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p"
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.406647 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-bundle\") pod \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") "
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.406725 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-util\") pod \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") "
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.406788 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfklr\" (UniqueName: \"kubernetes.io/projected/d1016dfd-9651-4c1f-94f4-312c4eab6a00-kube-api-access-pfklr\") pod \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\" (UID: \"d1016dfd-9651-4c1f-94f4-312c4eab6a00\") "
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.407597 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-bundle" (OuterVolumeSpecName: "bundle") pod "d1016dfd-9651-4c1f-94f4-312c4eab6a00" (UID: "d1016dfd-9651-4c1f-94f4-312c4eab6a00"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.415404 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1016dfd-9651-4c1f-94f4-312c4eab6a00-kube-api-access-pfklr" (OuterVolumeSpecName: "kube-api-access-pfklr") pod "d1016dfd-9651-4c1f-94f4-312c4eab6a00" (UID: "d1016dfd-9651-4c1f-94f4-312c4eab6a00"). InnerVolumeSpecName "kube-api-access-pfklr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.421680 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-util" (OuterVolumeSpecName: "util") pod "665b9a04-0d24-45c5-9129-8d37342f2674" (UID: "665b9a04-0d24-45c5-9129-8d37342f2674"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.427429 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-util" (OuterVolumeSpecName: "util") pod "d1016dfd-9651-4c1f-94f4-312c4eab6a00" (UID: "d1016dfd-9651-4c1f-94f4-312c4eab6a00"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.508636 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.508673 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1016dfd-9651-4c1f-94f4-312c4eab6a00-util\") on node \"crc\" DevicePath \"\""
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.508687 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfklr\" (UniqueName: \"kubernetes.io/projected/d1016dfd-9651-4c1f-94f4-312c4eab6a00-kube-api-access-pfklr\") on node \"crc\" DevicePath \"\""
Nov 29 14:41:55 crc kubenswrapper[4907]: I1129 14:41:55.508703 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/665b9a04-0d24-45c5-9129-8d37342f2674-util\") on node \"crc\" DevicePath \"\""
Nov 29 14:41:56 crc kubenswrapper[4907]: I1129 14:41:56.109586 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p" event={"ID":"d1016dfd-9651-4c1f-94f4-312c4eab6a00","Type":"ContainerDied","Data":"f2e9a1d925ecc6f59c2dedaf4faa2307b3fa79f8008fbf5b8f6d7b6fa3a9c35c"}
Nov 29 14:41:56 crc kubenswrapper[4907]: I1129 14:41:56.110651 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2e9a1d925ecc6f59c2dedaf4faa2307b3fa79f8008fbf5b8f6d7b6fa3a9c35c"
Nov 29 14:41:56 crc kubenswrapper[4907]: I1129 14:41:56.109633 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p"
Nov 29 14:41:56 crc kubenswrapper[4907]: I1129 14:41:56.111933 4907 generic.go:334] "Generic (PLEG): container finished" podID="c67b4696-5fc8-4648-b126-6c184880c981" containerID="80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc" exitCode=0
Nov 29 14:41:56 crc kubenswrapper[4907]: I1129 14:41:56.111995 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5g84" event={"ID":"c67b4696-5fc8-4648-b126-6c184880c981","Type":"ContainerDied","Data":"80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc"}
Nov 29 14:41:57 crc kubenswrapper[4907]: I1129 14:41:57.122242 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5g84" event={"ID":"c67b4696-5fc8-4648-b126-6c184880c981","Type":"ContainerStarted","Data":"2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad"}
Nov 29 14:41:57 crc kubenswrapper[4907]: I1129 14:41:57.141613 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f5g84" podStartSLOduration=2.52308989 podStartE2EDuration="6.141595006s" podCreationTimestamp="2025-11-29 14:41:51 +0000 UTC" firstStartedPulling="2025-11-29 14:41:53.076459589 +0000 UTC m=+811.063297241" lastFinishedPulling="2025-11-29 14:41:56.694964665 +0000 UTC m=+814.681802357" observedRunningTime="2025-11-29 14:41:57.140546386 +0000 UTC m=+815.127384048" watchObservedRunningTime="2025-11-29 14:41:57.141595006 +0000 UTC m=+815.128432668"
Nov 29 14:41:58 crc kubenswrapper[4907]: I1129 14:41:58.490102 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 14:41:58 crc kubenswrapper[4907]: I1129 14:41:58.490396 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 14:41:58 crc kubenswrapper[4907]: I1129 14:41:58.491937 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9"
Nov 29 14:41:58 crc kubenswrapper[4907]: I1129 14:41:58.492730 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6d0279b6c0a0b7cac049f6991025c0a86d66be3a78b2d46b37cde84b40abbc6"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 29 14:41:58 crc kubenswrapper[4907]: I1129 14:41:58.492836 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://f6d0279b6c0a0b7cac049f6991025c0a86d66be3a78b2d46b37cde84b40abbc6" gracePeriod=600
Nov 29 14:41:59 crc kubenswrapper[4907]: I1129 14:41:59.134879 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="f6d0279b6c0a0b7cac049f6991025c0a86d66be3a78b2d46b37cde84b40abbc6" exitCode=0
Nov 29 14:41:59 crc kubenswrapper[4907]: I1129 14:41:59.134943 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"f6d0279b6c0a0b7cac049f6991025c0a86d66be3a78b2d46b37cde84b40abbc6"}
Nov 29 14:41:59 crc kubenswrapper[4907]: I1129 14:41:59.135217 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"81cf87bbb8090f9964b6b2dbf0b6be6946b5091a7113f8782940ac4da5885e64"}
Nov 29 14:41:59 crc kubenswrapper[4907]: I1129 14:41:59.135254 4907 scope.go:117] "RemoveContainer" containerID="6266188cd3801cb79e9076cc411ccc7b4b18d94f48d528ad87b448fafd9cdc7d"
Nov 29 14:42:01 crc kubenswrapper[4907]: I1129 14:42:01.761746 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f5g84"
Nov 29 14:42:01 crc kubenswrapper[4907]: I1129 14:42:01.762693 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f5g84"
Nov 29 14:42:02 crc kubenswrapper[4907]: I1129 14:42:02.829580 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f5g84" podUID="c67b4696-5fc8-4648-b126-6c184880c981" containerName="registry-server" probeResult="failure" output=<
Nov 29 14:42:02 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s
Nov 29 14:42:02 crc kubenswrapper[4907]: >
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.371281 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"]
Nov 29 14:42:04 crc kubenswrapper[4907]: E1129 14:42:04.372851 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerName="extract"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.372926 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerName="extract"
Nov 29 14:42:04 crc kubenswrapper[4907]: E1129 14:42:04.373058 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="665b9a04-0d24-45c5-9129-8d37342f2674" containerName="util"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.373126 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="665b9a04-0d24-45c5-9129-8d37342f2674" containerName="util"
Nov 29 14:42:04 crc kubenswrapper[4907]: E1129 14:42:04.373209 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerName="util"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.373264 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerName="util"
Nov 29 14:42:04 crc kubenswrapper[4907]: E1129 14:42:04.373338 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="665b9a04-0d24-45c5-9129-8d37342f2674" containerName="extract"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.373387 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="665b9a04-0d24-45c5-9129-8d37342f2674" containerName="extract"
Nov 29 14:42:04 crc kubenswrapper[4907]: E1129 14:42:04.373471 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerName="pull"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.373532 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerName="pull"
Nov 29 14:42:04 crc kubenswrapper[4907]: E1129 14:42:04.373602 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="665b9a04-0d24-45c5-9129-8d37342f2674" containerName="pull"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.373664 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="665b9a04-0d24-45c5-9129-8d37342f2674" containerName="pull"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.373877 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="665b9a04-0d24-45c5-9129-8d37342f2674" containerName="extract"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.373950 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1016dfd-9651-4c1f-94f4-312c4eab6a00" containerName="extract"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.374934 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.378486 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.378550 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.378627 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.378651 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-xn59b"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.380963 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.385302 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.396097 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"]
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.565007 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/faed25bd-9bb2-4409-927a-e70521fb534c-manager-config\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.565053 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/faed25bd-9bb2-4409-927a-e70521fb534c-apiservice-cert\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.565088 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/faed25bd-9bb2-4409-927a-e70521fb534c-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.565105 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffqhd\" (UniqueName: \"kubernetes.io/projected/faed25bd-9bb2-4409-927a-e70521fb534c-kube-api-access-ffqhd\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.565142 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/faed25bd-9bb2-4409-927a-e70521fb534c-webhook-cert\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.665727 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffqhd\" (UniqueName: \"kubernetes.io/projected/faed25bd-9bb2-4409-927a-e70521fb534c-kube-api-access-ffqhd\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.665789 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/faed25bd-9bb2-4409-927a-e70521fb534c-webhook-cert\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.665862 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/faed25bd-9bb2-4409-927a-e70521fb534c-manager-config\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.665891 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/faed25bd-9bb2-4409-927a-e70521fb534c-apiservice-cert\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.665922 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/faed25bd-9bb2-4409-927a-e70521fb534c-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.667081 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/faed25bd-9bb2-4409-927a-e70521fb534c-manager-config\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.675785 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/faed25bd-9bb2-4409-927a-e70521fb534c-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.683285 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/faed25bd-9bb2-4409-927a-e70521fb534c-webhook-cert\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.684003 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/faed25bd-9bb2-4409-927a-e70521fb534c-apiservice-cert\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.692225 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffqhd\" (UniqueName: \"kubernetes.io/projected/faed25bd-9bb2-4409-927a-e70521fb534c-kube-api-access-ffqhd\") pod \"loki-operator-controller-manager-6ddbc98977-wnwpz\" (UID: \"faed25bd-9bb2-4409-927a-e70521fb534c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:04 crc kubenswrapper[4907]: I1129 14:42:04.702078 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"
Nov 29 14:42:05 crc kubenswrapper[4907]: I1129 14:42:05.143502 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz"]
Nov 29 14:42:05 crc kubenswrapper[4907]: I1129 14:42:05.176052 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz" event={"ID":"faed25bd-9bb2-4409-927a-e70521fb534c","Type":"ContainerStarted","Data":"1396ec575e6625441b5c23a6f96238e091596d2b2bdfb57efada7fe10648333a"}
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.115912 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-22svz"]
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.117286 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-22svz"
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.121041 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt"
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.121492 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-2crkr"
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.121960 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt"
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.145452 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-22svz"]
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.254686 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfk28\" (UniqueName: \"kubernetes.io/projected/75775bda-f952-44db-a0c1-01993254453f-kube-api-access-rfk28\") pod \"cluster-logging-operator-ff9846bd-22svz\" (UID: \"75775bda-f952-44db-a0c1-01993254453f\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-22svz"
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.356149 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfk28\" (UniqueName: \"kubernetes.io/projected/75775bda-f952-44db-a0c1-01993254453f-kube-api-access-rfk28\") pod \"cluster-logging-operator-ff9846bd-22svz\" (UID: \"75775bda-f952-44db-a0c1-01993254453f\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-22svz"
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.380481 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfk28\" (UniqueName: \"kubernetes.io/projected/75775bda-f952-44db-a0c1-01993254453f-kube-api-access-rfk28\") pod \"cluster-logging-operator-ff9846bd-22svz\" (UID: \"75775bda-f952-44db-a0c1-01993254453f\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-22svz"
Nov 29 14:42:09 crc kubenswrapper[4907]: I1129 14:42:09.446829 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-22svz"
Nov 29 14:42:11 crc kubenswrapper[4907]: I1129 14:42:11.406533 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-22svz"]
Nov 29 14:42:11 crc kubenswrapper[4907]: I1129 14:42:11.829895 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f5g84"
Nov 29 14:42:11 crc kubenswrapper[4907]: I1129 14:42:11.912021 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f5g84"
Nov 29 14:42:12 crc kubenswrapper[4907]: I1129 14:42:12.239286 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz" event={"ID":"faed25bd-9bb2-4409-927a-e70521fb534c","Type":"ContainerStarted","Data":"dc7866d5e815f8b6d4b361f056312a6aaa8a5dd6dd42c2ef3d1befcfe8136b6a"}
Nov 29 14:42:12 crc kubenswrapper[4907]: I1129 14:42:12.240477 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-22svz" event={"ID":"75775bda-f952-44db-a0c1-01993254453f","Type":"ContainerStarted","Data":"1c27aead7bd3862a3bdf3452f39c277d6b9e2919332d95442ff5ccbe41b4405f"}
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.212091 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5g84"]
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.212907 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f5g84" podUID="c67b4696-5fc8-4648-b126-6c184880c981" containerName="registry-server" containerID="cri-o://2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad" gracePeriod=2
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.726139 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5g84"
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.773066 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q75dw\" (UniqueName: \"kubernetes.io/projected/c67b4696-5fc8-4648-b126-6c184880c981-kube-api-access-q75dw\") pod \"c67b4696-5fc8-4648-b126-6c184880c981\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") "
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.773346 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-utilities\") pod \"c67b4696-5fc8-4648-b126-6c184880c981\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") "
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.773380 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-catalog-content\") pod \"c67b4696-5fc8-4648-b126-6c184880c981\" (UID: \"c67b4696-5fc8-4648-b126-6c184880c981\") "
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.775047 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-utilities" (OuterVolumeSpecName: "utilities") pod "c67b4696-5fc8-4648-b126-6c184880c981" (UID: "c67b4696-5fc8-4648-b126-6c184880c981"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.782416 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c67b4696-5fc8-4648-b126-6c184880c981-kube-api-access-q75dw" (OuterVolumeSpecName: "kube-api-access-q75dw") pod "c67b4696-5fc8-4648-b126-6c184880c981" (UID: "c67b4696-5fc8-4648-b126-6c184880c981"). InnerVolumeSpecName "kube-api-access-q75dw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.876554 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.876720 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q75dw\" (UniqueName: \"kubernetes.io/projected/c67b4696-5fc8-4648-b126-6c184880c981-kube-api-access-q75dw\") on node \"crc\" DevicePath \"\""
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.887002 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c67b4696-5fc8-4648-b126-6c184880c981" (UID: "c67b4696-5fc8-4648-b126-6c184880c981"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:42:15 crc kubenswrapper[4907]: I1129 14:42:15.978236 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67b4696-5fc8-4648-b126-6c184880c981-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 14:42:16 crc kubenswrapper[4907]: I1129 14:42:16.272432 4907 generic.go:334] "Generic (PLEG): container finished" podID="c67b4696-5fc8-4648-b126-6c184880c981" containerID="2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad" exitCode=0
Nov 29 14:42:16 crc kubenswrapper[4907]: I1129 14:42:16.272499 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5g84" event={"ID":"c67b4696-5fc8-4648-b126-6c184880c981","Type":"ContainerDied","Data":"2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad"}
Nov 29 14:42:16 crc kubenswrapper[4907]: I1129 14:42:16.272532 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5g84" event={"ID":"c67b4696-5fc8-4648-b126-6c184880c981","Type":"ContainerDied","Data":"3e43fb8207845aca5bf19b2cf20f1b5a84837f93161a597f0634b07e916b7a37"}
Nov 29 14:42:16 crc kubenswrapper[4907]: I1129 14:42:16.272554 4907 scope.go:117] "RemoveContainer" containerID="2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad"
Nov 29 14:42:16 crc kubenswrapper[4907]: I1129 14:42:16.272741 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5g84"
Nov 29 14:42:16 crc kubenswrapper[4907]: I1129 14:42:16.339345 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5g84"]
Nov 29 14:42:16 crc kubenswrapper[4907]: I1129 14:42:16.352121 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f5g84"]
Nov 29 14:42:16 crc kubenswrapper[4907]: I1129 14:42:16.485698 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c67b4696-5fc8-4648-b126-6c184880c981" path="/var/lib/kubelet/pods/c67b4696-5fc8-4648-b126-6c184880c981/volumes"
Nov 29 14:42:19 crc kubenswrapper[4907]: I1129 14:42:19.187033 4907 scope.go:117] "RemoveContainer" containerID="80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc"
Nov 29 14:42:21 crc kubenswrapper[4907]: I1129 14:42:21.406757 4907 scope.go:117] "RemoveContainer" containerID="5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01"
Nov 29 14:42:21 crc kubenswrapper[4907]: I1129 14:42:21.477617 4907 scope.go:117] "RemoveContainer" containerID="2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad"
Nov 29 14:42:21 crc kubenswrapper[4907]: E1129 14:42:21.478150 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad\": container with ID starting with 2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad not found: ID does not exist" containerID="2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad"
Nov 29 14:42:21 crc kubenswrapper[4907]: I1129 14:42:21.478195 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad"} err="failed to get container status \"2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad\": rpc error: code = NotFound desc = could not find container \"2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad\": container with ID starting with 2ac1f2f36720ee50a0171b7f9ccdc79b2498e8d1c07420715b9a9c450749abad not found: ID does not exist"
Nov 29 14:42:21 crc kubenswrapper[4907]: I1129 14:42:21.478226 4907 scope.go:117] "RemoveContainer" containerID="80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc"
Nov 29 14:42:21 crc kubenswrapper[4907]: E1129 14:42:21.491646 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc\": container with ID starting with 80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc not found: ID does not exist" containerID="80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc"
Nov 29 14:42:21 crc kubenswrapper[4907]: I1129 14:42:21.491702 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc"} err="failed to get container status \"80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc\": rpc error: code = NotFound desc = could not find container \"80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc\": container with ID starting with 80c7a8a0c952199b3d8279d0d927199cca4129069c486ca2e1077829ec66e3dc not found: ID does not 
exist" Nov 29 14:42:21 crc kubenswrapper[4907]: I1129 14:42:21.491729 4907 scope.go:117] "RemoveContainer" containerID="5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01" Nov 29 14:42:21 crc kubenswrapper[4907]: E1129 14:42:21.492571 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01\": container with ID starting with 5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01 not found: ID does not exist" containerID="5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01" Nov 29 14:42:21 crc kubenswrapper[4907]: I1129 14:42:21.492662 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01"} err="failed to get container status \"5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01\": rpc error: code = NotFound desc = could not find container \"5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01\": container with ID starting with 5e24c7ef5996c49181da49c3ea44a3abe824def3fd41257902782e1d1079ca01 not found: ID does not exist" Nov 29 14:42:22 crc kubenswrapper[4907]: I1129 14:42:22.313104 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz" event={"ID":"faed25bd-9bb2-4409-927a-e70521fb534c","Type":"ContainerStarted","Data":"a319553c9d52b8963571b2305bb5ae27b2c4cb878580734178efa7cd3810498c"} Nov 29 14:42:22 crc kubenswrapper[4907]: I1129 14:42:22.313818 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz" Nov 29 14:42:22 crc kubenswrapper[4907]: I1129 14:42:22.316210 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz" Nov 29 14:42:22 crc kubenswrapper[4907]: I1129 14:42:22.316532 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-22svz" event={"ID":"75775bda-f952-44db-a0c1-01993254453f","Type":"ContainerStarted","Data":"63d9be331373eaa1b4225aac0d1585cc6a7fb5859f40c4c89b7cf21e6579cdec"} Nov 29 14:42:22 crc kubenswrapper[4907]: I1129 14:42:22.336357 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-6ddbc98977-wnwpz" podStartSLOduration=1.9962875329999998 podStartE2EDuration="18.336339872s" podCreationTimestamp="2025-11-29 14:42:04 +0000 UTC" firstStartedPulling="2025-11-29 14:42:05.151627131 +0000 UTC m=+823.138464783" lastFinishedPulling="2025-11-29 14:42:21.49167946 +0000 UTC m=+839.478517122" observedRunningTime="2025-11-29 14:42:22.335182799 +0000 UTC m=+840.322020471" watchObservedRunningTime="2025-11-29 14:42:22.336339872 +0000 UTC m=+840.323177524" Nov 29 14:42:22 crc kubenswrapper[4907]: I1129 14:42:22.382879 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-22svz" podStartSLOduration=3.383189 podStartE2EDuration="13.382854513s" podCreationTimestamp="2025-11-29 14:42:09 +0000 UTC" firstStartedPulling="2025-11-29 14:42:11.419311628 +0000 UTC m=+829.406149270" lastFinishedPulling="2025-11-29 14:42:21.418977121 +0000 UTC m=+839.405814783" observedRunningTime="2025-11-29 14:42:22.377516093 +0000 UTC m=+840.364353765" watchObservedRunningTime="2025-11-29 14:42:22.382854513 +0000 UTC m=+840.369692165" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.358375 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Nov 29 14:42:27 crc kubenswrapper[4907]: E1129 14:42:27.359078 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c67b4696-5fc8-4648-b126-6c184880c981" containerName="registry-server" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.359090 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67b4696-5fc8-4648-b126-6c184880c981" containerName="registry-server" Nov 29 14:42:27 crc kubenswrapper[4907]: E1129 14:42:27.359105 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67b4696-5fc8-4648-b126-6c184880c981" containerName="extract-content" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.359111 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67b4696-5fc8-4648-b126-6c184880c981" containerName="extract-content" Nov 29 14:42:27 crc kubenswrapper[4907]: E1129 14:42:27.359139 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67b4696-5fc8-4648-b126-6c184880c981" containerName="extract-utilities" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.359145 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67b4696-5fc8-4648-b126-6c184880c981" containerName="extract-utilities" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.359307 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67b4696-5fc8-4648-b126-6c184880c981" containerName="registry-server" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.359755 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.362038 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.362836 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.387940 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.464575 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-befefd19-047d-4b46-9ccf-e13a657ae934\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-befefd19-047d-4b46-9ccf-e13a657ae934\") pod \"minio\" (UID: \"c1dbadd6-6fcb-4869-9335-327bc085f814\") " pod="minio-dev/minio" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.464681 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmsg8\" (UniqueName: \"kubernetes.io/projected/c1dbadd6-6fcb-4869-9335-327bc085f814-kube-api-access-qmsg8\") pod \"minio\" (UID: \"c1dbadd6-6fcb-4869-9335-327bc085f814\") " pod="minio-dev/minio" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.566600 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmsg8\" (UniqueName: \"kubernetes.io/projected/c1dbadd6-6fcb-4869-9335-327bc085f814-kube-api-access-qmsg8\") pod \"minio\" (UID: \"c1dbadd6-6fcb-4869-9335-327bc085f814\") " pod="minio-dev/minio" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.566703 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-befefd19-047d-4b46-9ccf-e13a657ae934\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-befefd19-047d-4b46-9ccf-e13a657ae934\") pod \"minio\" (UID: 
\"c1dbadd6-6fcb-4869-9335-327bc085f814\") " pod="minio-dev/minio" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.570533 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.570581 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-befefd19-047d-4b46-9ccf-e13a657ae934\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-befefd19-047d-4b46-9ccf-e13a657ae934\") pod \"minio\" (UID: \"c1dbadd6-6fcb-4869-9335-327bc085f814\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/16d3d1827490bfbe5650d84c62d9e18aedc4fd95656be0c506737f9ce4181853/globalmount\"" pod="minio-dev/minio" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.620068 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmsg8\" (UniqueName: \"kubernetes.io/projected/c1dbadd6-6fcb-4869-9335-327bc085f814-kube-api-access-qmsg8\") pod \"minio\" (UID: \"c1dbadd6-6fcb-4869-9335-327bc085f814\") " pod="minio-dev/minio" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.624401 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-befefd19-047d-4b46-9ccf-e13a657ae934\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-befefd19-047d-4b46-9ccf-e13a657ae934\") pod \"minio\" (UID: \"c1dbadd6-6fcb-4869-9335-327bc085f814\") " pod="minio-dev/minio" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.687754 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Nov 29 14:42:27 crc kubenswrapper[4907]: I1129 14:42:27.985337 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Nov 29 14:42:28 crc kubenswrapper[4907]: I1129 14:42:28.361311 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"c1dbadd6-6fcb-4869-9335-327bc085f814","Type":"ContainerStarted","Data":"3888a0216a9d804c507d3a18ab3426ca6cfb4c00996911a5d7614f4a9d7f4448"} Nov 29 14:42:33 crc kubenswrapper[4907]: I1129 14:42:33.396163 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"c1dbadd6-6fcb-4869-9335-327bc085f814","Type":"ContainerStarted","Data":"7f9bc883feb39db69ec236f80226dd6f7229a593161035de837dbaa68408b79a"} Nov 29 14:42:33 crc kubenswrapper[4907]: I1129 14:42:33.426629 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.221055522 podStartE2EDuration="8.426601427s" podCreationTimestamp="2025-11-29 14:42:25 +0000 UTC" firstStartedPulling="2025-11-29 14:42:28.009172989 +0000 UTC m=+845.996010641" lastFinishedPulling="2025-11-29 14:42:32.214718894 +0000 UTC m=+850.201556546" observedRunningTime="2025-11-29 14:42:33.416747479 +0000 UTC m=+851.403585171" watchObservedRunningTime="2025-11-29 14:42:33.426601427 +0000 UTC m=+851.413439109" Nov 29 14:42:35 crc kubenswrapper[4907]: I1129 14:42:35.834293 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f9t4r"] Nov 29 14:42:35 crc kubenswrapper[4907]: I1129 14:42:35.836196 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:35 crc kubenswrapper[4907]: I1129 14:42:35.851675 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9t4r"] Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.025795 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-catalog-content\") pod \"redhat-marketplace-f9t4r\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.026069 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-utilities\") pod \"redhat-marketplace-f9t4r\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.026095 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vcj\" (UniqueName: \"kubernetes.io/projected/55d74aa1-9372-4d35-be6f-e26c62642bb8-kube-api-access-g6vcj\") pod \"redhat-marketplace-f9t4r\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.127287 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-catalog-content\") pod \"redhat-marketplace-f9t4r\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.127403 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-utilities\") pod \"redhat-marketplace-f9t4r\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.128050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-catalog-content\") pod \"redhat-marketplace-f9t4r\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.128125 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vcj\" (UniqueName: \"kubernetes.io/projected/55d74aa1-9372-4d35-be6f-e26c62642bb8-kube-api-access-g6vcj\") pod \"redhat-marketplace-f9t4r\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.128194 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-utilities\") pod \"redhat-marketplace-f9t4r\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.144943 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vcj\" (UniqueName: \"kubernetes.io/projected/55d74aa1-9372-4d35-be6f-e26c62642bb8-kube-api-access-g6vcj\") pod \"redhat-marketplace-f9t4r\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.196079 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:36 crc kubenswrapper[4907]: I1129 14:42:36.616178 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9t4r"] Nov 29 14:42:36 crc kubenswrapper[4907]: W1129 14:42:36.624353 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d74aa1_9372_4d35_be6f_e26c62642bb8.slice/crio-aa3e8badd69f8b880f600aa52dbca0abf9696a35f83d0820cd859574aefdd40c WatchSource:0}: Error finding container aa3e8badd69f8b880f600aa52dbca0abf9696a35f83d0820cd859574aefdd40c: Status 404 returned error can't find the container with id aa3e8badd69f8b880f600aa52dbca0abf9696a35f83d0820cd859574aefdd40c Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.202154 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.203793 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.207690 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-w7kdb" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.208105 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.209273 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.210124 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.219317 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.224006 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.344379 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-config\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.344414 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6gp4\" (UniqueName: \"kubernetes.io/projected/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-kube-api-access-w6gp4\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " 
pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.344446 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.344475 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.344492 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.375873 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-dmfpd"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.376760 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.380319 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.386889 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.387778 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.395118 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-dmfpd"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.421647 4907 generic.go:334] "Generic (PLEG): container finished" podID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerID="91eb8a3e8bf24d6d17b9db5c904c77f83a0403db94c53b06ebddc4aea9bf365b" exitCode=0 Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.421693 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9t4r" event={"ID":"55d74aa1-9372-4d35-be6f-e26c62642bb8","Type":"ContainerDied","Data":"91eb8a3e8bf24d6d17b9db5c904c77f83a0403db94c53b06ebddc4aea9bf365b"} Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.421721 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9t4r" event={"ID":"55d74aa1-9372-4d35-be6f-e26c62642bb8","Type":"ContainerStarted","Data":"aa3e8badd69f8b880f600aa52dbca0abf9696a35f83d0820cd859574aefdd40c"} Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.446227 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-config\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: 
\"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.446270 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6gp4\" (UniqueName: \"kubernetes.io/projected/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-kube-api-access-w6gp4\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.446290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.446325 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.446371 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.447622 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.447804 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-config\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.454154 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.471115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6gp4\" (UniqueName: \"kubernetes.io/projected/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-kube-api-access-w6gp4\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.477109 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/54d3a7f1-ba2a-4744-937a-4bf219bb85ab-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-fxc7c\" (UID: \"54d3a7f1-ba2a-4744-937a-4bf219bb85ab\") " 
pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.526728 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.549774 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-config\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.549853 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.549886 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccpdf\" (UniqueName: \"kubernetes.io/projected/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-kube-api-access-ccpdf\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.549954 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " 
pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.550041 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.550094 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.576031 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.577420 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.580290 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.584689 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.591283 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.651640 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.652337 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.652395 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-config\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.652453 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.652507 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccpdf\" (UniqueName: \"kubernetes.io/projected/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-kube-api-access-ccpdf\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.652651 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.653344 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.653509 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-config\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " 
pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.657430 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.658185 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.659715 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.677014 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccpdf\" (UniqueName: \"kubernetes.io/projected/53bdaeef-1d57-48e5-8b2d-bc9edacd5351-kube-api-access-ccpdf\") pod \"logging-loki-querier-5895d59bb8-dmfpd\" (UID: \"53bdaeef-1d57-48e5-8b2d-bc9edacd5351\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.697050 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.756242 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34885590-0043-44b5-be42-f726d65f8487-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.756284 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/34885590-0043-44b5-be42-f726d65f8487-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.756306 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/34885590-0043-44b5-be42-f726d65f8487-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.756481 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34885590-0043-44b5-be42-f726d65f8487-config\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 
14:42:37.756599 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8drdt\" (UniqueName: \"kubernetes.io/projected/34885590-0043-44b5-be42-f726d65f8487-kube-api-access-8drdt\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.773301 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-586bf9b9f5-7865w"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.774832 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.776658 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.776738 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.777089 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.777232 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.777353 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.778198 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.779771 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.781408 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-6zfc8" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.803463 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-586bf9b9f5-7865w"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.812622 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p"] Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.858287 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/34885590-0043-44b5-be42-f726d65f8487-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.858380 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34885590-0043-44b5-be42-f726d65f8487-config\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.858469 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8drdt\" (UniqueName: \"kubernetes.io/projected/34885590-0043-44b5-be42-f726d65f8487-kube-api-access-8drdt\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: 
I1129 14:42:37.858530 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34885590-0043-44b5-be42-f726d65f8487-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.858553 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/34885590-0043-44b5-be42-f726d65f8487-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.859868 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34885590-0043-44b5-be42-f726d65f8487-config\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.861723 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34885590-0043-44b5-be42-f726d65f8487-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.870381 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/34885590-0043-44b5-be42-f726d65f8487-logging-loki-query-frontend-grpc\") pod 
\"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.879478 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/34885590-0043-44b5-be42-f726d65f8487-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.881661 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8drdt\" (UniqueName: \"kubernetes.io/projected/34885590-0043-44b5-be42-f726d65f8487-kube-api-access-8drdt\") pod \"logging-loki-query-frontend-84558f7c9f-xpnnl\" (UID: \"34885590-0043-44b5-be42-f726d65f8487\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.920250 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.968160 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fb80f35c-6683-4296-afd9-a0895e860a3d-tenants\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.968337 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/53107106-32ab-4c46-949f-094abb62ce68-tls-secret\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970046 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-rbac\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970089 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwrp\" (UniqueName: \"kubernetes.io/projected/53107106-32ab-4c46-949f-094abb62ce68-kube-api-access-hgwrp\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970144 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-logging-loki-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970181 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970232 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/53107106-32ab-4c46-949f-094abb62ce68-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970255 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-lokistack-gateway\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970294 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fb80f35c-6683-4296-afd9-a0895e860a3d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: 
\"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970314 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/53107106-32ab-4c46-949f-094abb62ce68-tenants\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970337 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-rbac\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970357 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970402 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-lokistack-gateway\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970524 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fb80f35c-6683-4296-afd9-a0895e860a3d-tls-secret\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970551 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2hx\" (UniqueName: \"kubernetes.io/projected/fb80f35c-6683-4296-afd9-a0895e860a3d-kube-api-access-5s2hx\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:37 crc kubenswrapper[4907]: I1129 14:42:37.970597 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.070975 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fb80f35c-6683-4296-afd9-a0895e860a3d-tenants\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071012 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/53107106-32ab-4c46-949f-094abb62ce68-tls-secret\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc 
kubenswrapper[4907]: I1129 14:42:38.071042 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-rbac\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071058 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwrp\" (UniqueName: \"kubernetes.io/projected/53107106-32ab-4c46-949f-094abb62ce68-kube-api-access-hgwrp\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071077 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-logging-loki-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071095 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071124 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/53107106-32ab-4c46-949f-094abb62ce68-logging-loki-gateway-client-http\") pod 
\"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071141 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-lokistack-gateway\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071165 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fb80f35c-6683-4296-afd9-a0895e860a3d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071181 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/53107106-32ab-4c46-949f-094abb62ce68-tenants\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071199 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-rbac\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071215 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071241 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-lokistack-gateway\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071280 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fb80f35c-6683-4296-afd9-a0895e860a3d-tls-secret\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071300 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2hx\" (UniqueName: \"kubernetes.io/projected/fb80f35c-6683-4296-afd9-a0895e860a3d-kube-api-access-5s2hx\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.071318 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 
14:42:38.072245 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.073046 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-lokistack-gateway\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.073796 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.075115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-rbac\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.075903 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-logging-loki-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " 
pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.076337 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/53107106-32ab-4c46-949f-094abb62ce68-tenants\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.076577 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.077115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fb80f35c-6683-4296-afd9-a0895e860a3d-tenants\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.077195 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fb80f35c-6683-4296-afd9-a0895e860a3d-lokistack-gateway\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.078642 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/53107106-32ab-4c46-949f-094abb62ce68-rbac\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: 
\"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.081988 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fb80f35c-6683-4296-afd9-a0895e860a3d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.084262 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/53107106-32ab-4c46-949f-094abb62ce68-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.085109 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/53107106-32ab-4c46-949f-094abb62ce68-tls-secret\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.085189 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fb80f35c-6683-4296-afd9-a0895e860a3d-tls-secret\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.113964 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2hx\" (UniqueName: 
\"kubernetes.io/projected/fb80f35c-6683-4296-afd9-a0895e860a3d-kube-api-access-5s2hx\") pod \"logging-loki-gateway-586bf9b9f5-wlw8p\" (UID: \"fb80f35c-6683-4296-afd9-a0895e860a3d\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.123114 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwrp\" (UniqueName: \"kubernetes.io/projected/53107106-32ab-4c46-949f-094abb62ce68-kube-api-access-hgwrp\") pod \"logging-loki-gateway-586bf9b9f5-7865w\" (UID: \"53107106-32ab-4c46-949f-094abb62ce68\") " pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.142863 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c"] Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.199159 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-dmfpd"] Nov 29 14:42:38 crc kubenswrapper[4907]: W1129 14:42:38.216645 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53bdaeef_1d57_48e5_8b2d_bc9edacd5351.slice/crio-2203e3b8f7777c4c58316955649e7ce33999c83d6765feeecc8bb3f1eda90604 WatchSource:0}: Error finding container 2203e3b8f7777c4c58316955649e7ce33999c83d6765feeecc8bb3f1eda90604: Status 404 returned error can't find the container with id 2203e3b8f7777c4c58316955649e7ce33999c83d6765feeecc8bb3f1eda90604 Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.366501 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl"] Nov 29 14:42:38 crc kubenswrapper[4907]: W1129 14:42:38.369821 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34885590_0043_44b5_be42_f726d65f8487.slice/crio-5d35b9e1d21c5f16578f1891686b88fa4cc5fbb57655345714a5a63ebdb22539 WatchSource:0}: Error finding container 5d35b9e1d21c5f16578f1891686b88fa4cc5fbb57655345714a5a63ebdb22539: Status 404 returned error can't find the container with id 5d35b9e1d21c5f16578f1891686b88fa4cc5fbb57655345714a5a63ebdb22539 Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.393366 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.394223 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.396012 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.397004 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.397748 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.410992 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.413007 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.452428 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.467687 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" event={"ID":"53bdaeef-1d57-48e5-8b2d-bc9edacd5351","Type":"ContainerStarted","Data":"2203e3b8f7777c4c58316955649e7ce33999c83d6765feeecc8bb3f1eda90604"} Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.467751 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" event={"ID":"54d3a7f1-ba2a-4744-937a-4bf219bb85ab","Type":"ContainerStarted","Data":"f83cb10c814ee81fefe8155836d90662a5998a85c5e21cf5dd3129adb05a67ce"} Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.467878 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.475266 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.475546 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.497853 4907 generic.go:334] "Generic (PLEG): container finished" podID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerID="443705100ec156ce5ea21d0835922649124fe2066372e7853bbbbf80f11c4a0f" exitCode=0 Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.509145 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9t4r" event={"ID":"55d74aa1-9372-4d35-be6f-e26c62642bb8","Type":"ContainerDied","Data":"443705100ec156ce5ea21d0835922649124fe2066372e7853bbbbf80f11c4a0f"} Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.509218 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" event={"ID":"34885590-0043-44b5-be42-f726d65f8487","Type":"ContainerStarted","Data":"5d35b9e1d21c5f16578f1891686b88fa4cc5fbb57655345714a5a63ebdb22539"} Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.509244 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7eaed1d2-efc5-4a55-a12d-9652e3343ff1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eaed1d2-efc5-4a55-a12d-9652e3343ff1\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc 
kubenswrapper[4907]: I1129 14:42:38.593528 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1395c265-394b-4f9d-9bab-ebdff563a7b2-config\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593554 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593582 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593600 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593657 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " 
pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593675 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msd74\" (UniqueName: \"kubernetes.io/projected/1395c265-394b-4f9d-9bab-ebdff563a7b2-kube-api-access-msd74\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593745 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8a914a46-0246-4e10-a8b3-91d7d55de36a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a914a46-0246-4e10-a8b3-91d7d55de36a\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593861 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593907 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593936 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: 
\"kubernetes.io/secret/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.593978 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3cab4d62-f767-41f3-a510-53413927078a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3cab4d62-f767-41f3-a510-53413927078a\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.594001 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e05f45d-0f5c-45a0-81cb-673104c0f806-config\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.594020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.594061 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t269\" (UniqueName: \"kubernetes.io/projected/7e05f45d-0f5c-45a0-81cb-673104c0f806-kube-api-access-4t269\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.696468 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1395c265-394b-4f9d-9bab-ebdff563a7b2-config\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.696543 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.696576 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.698777 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.698808 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1395c265-394b-4f9d-9bab-ebdff563a7b2-config\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.700635 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.700752 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.700775 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msd74\" (UniqueName: \"kubernetes.io/projected/1395c265-394b-4f9d-9bab-ebdff563a7b2-kube-api-access-msd74\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.700810 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8a914a46-0246-4e10-a8b3-91d7d55de36a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a914a46-0246-4e10-a8b3-91d7d55de36a\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.700942 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.700974 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.701056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.701078 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3cab4d62-f767-41f3-a510-53413927078a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3cab4d62-f767-41f3-a510-53413927078a\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.701122 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e05f45d-0f5c-45a0-81cb-673104c0f806-config\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.701176 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 
14:42:38.701198 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t269\" (UniqueName: \"kubernetes.io/projected/7e05f45d-0f5c-45a0-81cb-673104c0f806-kube-api-access-4t269\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.701260 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7eaed1d2-efc5-4a55-a12d-9652e3343ff1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eaed1d2-efc5-4a55-a12d-9652e3343ff1\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.704740 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.705050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e05f45d-0f5c-45a0-81cb-673104c0f806-config\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.705506 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.706148 4907 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.706168 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3cab4d62-f767-41f3-a510-53413927078a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3cab4d62-f767-41f3-a510-53413927078a\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2dfd13827daa5d7626870e72ffbe6211693f2eb4657547dec2b14162980c6a42/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.706614 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.706630 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7eaed1d2-efc5-4a55-a12d-9652e3343ff1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eaed1d2-efc5-4a55-a12d-9652e3343ff1\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0cd374e77a4438fb72edc419690f625811dbb53c575f6298c4a4c875721c1eb9/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.706914 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/7e05f45d-0f5c-45a0-81cb-673104c0f806-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.710013 4907 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.710035 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8a914a46-0246-4e10-a8b3-91d7d55de36a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a914a46-0246-4e10-a8b3-91d7d55de36a\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1cb0a7e676c53edda50d9decc7894ae78da338dfe2bfeca7322a9cb76c537c03/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.711207 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.711845 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.717991 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.718633 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.719387 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.721249 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.721475 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.723458 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.732579 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/1395c265-394b-4f9d-9bab-ebdff563a7b2-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.748558 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msd74\" (UniqueName: \"kubernetes.io/projected/1395c265-394b-4f9d-9bab-ebdff563a7b2-kube-api-access-msd74\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.749329 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t269\" (UniqueName: 
\"kubernetes.io/projected/7e05f45d-0f5c-45a0-81cb-673104c0f806-kube-api-access-4t269\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.762478 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8a914a46-0246-4e10-a8b3-91d7d55de36a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a914a46-0246-4e10-a8b3-91d7d55de36a\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.770661 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3cab4d62-f767-41f3-a510-53413927078a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3cab4d62-f767-41f3-a510-53413927078a\") pod \"logging-loki-ingester-0\" (UID: \"7e05f45d-0f5c-45a0-81cb-673104c0f806\") " pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.771080 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-586bf9b9f5-7865w"] Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.779972 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7eaed1d2-efc5-4a55-a12d-9652e3343ff1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eaed1d2-efc5-4a55-a12d-9652e3343ff1\") pod \"logging-loki-compactor-0\" (UID: \"1395c265-394b-4f9d-9bab-ebdff563a7b2\") " pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.803322 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-config\") pod \"logging-loki-index-gateway-0\" (UID: 
\"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.803821 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.803882 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.803927 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.803956 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-27cea880-cf38-4d51-a88b-7033f29cbacf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27cea880-cf38-4d51-a88b-7033f29cbacf\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.803999 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.804024 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg69f\" (UniqueName: \"kubernetes.io/projected/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-kube-api-access-qg69f\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.804862 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.904520 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p"] Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.905813 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-config\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.905894 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.905960 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.906026 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.906063 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-27cea880-cf38-4d51-a88b-7033f29cbacf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27cea880-cf38-4d51-a88b-7033f29cbacf\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.906103 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.906140 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg69f\" (UniqueName: \"kubernetes.io/projected/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-kube-api-access-qg69f\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " 
pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.913789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.913999 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.916292 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-config\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.916643 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.917923 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: 
\"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: W1129 14:42:38.922638 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb80f35c_6683_4296_afd9_a0895e860a3d.slice/crio-99c81088920037c0a9bf52c4029da38aa2d89f9b1b718539d6b4230bd0f7397a WatchSource:0}: Error finding container 99c81088920037c0a9bf52c4029da38aa2d89f9b1b718539d6b4230bd0f7397a: Status 404 returned error can't find the container with id 99c81088920037c0a9bf52c4029da38aa2d89f9b1b718539d6b4230bd0f7397a Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.923400 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg69f\" (UniqueName: \"kubernetes.io/projected/d24d8eb0-2f0a-41f7-9234-2ae2bab4b191-kube-api-access-qg69f\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.924310 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.924349 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-27cea880-cf38-4d51-a88b-7033f29cbacf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27cea880-cf38-4d51-a88b-7033f29cbacf\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c7ac0517f1d9aad554daa77e63df7213e92296e506239ea1c9f3d1f5810b6307/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:38 crc kubenswrapper[4907]: I1129 14:42:38.983554 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-27cea880-cf38-4d51-a88b-7033f29cbacf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27cea880-cf38-4d51-a88b-7033f29cbacf\") pod \"logging-loki-index-gateway-0\" (UID: \"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191\") " pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.026143 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.059916 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.062341 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.371611 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Nov 29 14:42:39 crc kubenswrapper[4907]: W1129 14:42:39.383737 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd24d8eb0_2f0a_41f7_9234_2ae2bab4b191.slice/crio-f7374e45e0bbe7e1ca9225c69aa799b8ea2eead012eefae52705c957da91fbce WatchSource:0}: Error finding container f7374e45e0bbe7e1ca9225c69aa799b8ea2eead012eefae52705c957da91fbce: Status 404 returned error can't find the container with id f7374e45e0bbe7e1ca9225c69aa799b8ea2eead012eefae52705c957da91fbce Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.521165 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" event={"ID":"53107106-32ab-4c46-949f-094abb62ce68","Type":"ContainerStarted","Data":"e6261b2592bb2ee5546ffb4703d3dcb0ca848ba5c0ad6e2472f5183fdbc72fea"} Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.525187 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191","Type":"ContainerStarted","Data":"f7374e45e0bbe7e1ca9225c69aa799b8ea2eead012eefae52705c957da91fbce"} Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.527688 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9t4r" event={"ID":"55d74aa1-9372-4d35-be6f-e26c62642bb8","Type":"ContainerStarted","Data":"d8cd7d76640273038a796010210314b01ccc35eabe5ea8eb5635468fc697640f"} Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.529183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" 
event={"ID":"1395c265-394b-4f9d-9bab-ebdff563a7b2","Type":"ContainerStarted","Data":"d545cc15f383a7abe1beccf8dc7d7bb6a63cff9c9508971f42175c9298cc9411"} Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.530249 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" event={"ID":"fb80f35c-6683-4296-afd9-a0895e860a3d","Type":"ContainerStarted","Data":"99c81088920037c0a9bf52c4029da38aa2d89f9b1b718539d6b4230bd0f7397a"} Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.550615 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f9t4r" podStartSLOduration=3.007752152 podStartE2EDuration="4.550600674s" podCreationTimestamp="2025-11-29 14:42:35 +0000 UTC" firstStartedPulling="2025-11-29 14:42:37.423131751 +0000 UTC m=+855.409969403" lastFinishedPulling="2025-11-29 14:42:38.965980273 +0000 UTC m=+856.952817925" observedRunningTime="2025-11-29 14:42:39.549324618 +0000 UTC m=+857.536162280" watchObservedRunningTime="2025-11-29 14:42:39.550600674 +0000 UTC m=+857.537438326" Nov 29 14:42:39 crc kubenswrapper[4907]: I1129 14:42:39.616104 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Nov 29 14:42:39 crc kubenswrapper[4907]: W1129 14:42:39.637513 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e05f45d_0f5c_45a0_81cb_673104c0f806.slice/crio-fd2d524aa9cacc2c5d882fb2e5ed55609e102fdec5abedea9f29c45430e8dc53 WatchSource:0}: Error finding container fd2d524aa9cacc2c5d882fb2e5ed55609e102fdec5abedea9f29c45430e8dc53: Status 404 returned error can't find the container with id fd2d524aa9cacc2c5d882fb2e5ed55609e102fdec5abedea9f29c45430e8dc53 Nov 29 14:42:40 crc kubenswrapper[4907]: I1129 14:42:40.539295 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" 
event={"ID":"7e05f45d-0f5c-45a0-81cb-673104c0f806","Type":"ContainerStarted","Data":"fd2d524aa9cacc2c5d882fb2e5ed55609e102fdec5abedea9f29c45430e8dc53"} Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.574314 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" event={"ID":"54d3a7f1-ba2a-4744-937a-4bf219bb85ab","Type":"ContainerStarted","Data":"d3101c7393ee36651191a177969eaa449827e24eb08a351d83cc8d03b0bc13cd"} Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.575037 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.576914 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"d24d8eb0-2f0a-41f7-9234-2ae2bab4b191","Type":"ContainerStarted","Data":"f401403d1db263c67911e265019d38d47ec0438f156d07f06b6773c77becddc6"} Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.577048 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.580688 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"1395c265-394b-4f9d-9bab-ebdff563a7b2","Type":"ContainerStarted","Data":"c35bc06e36d37a5a8afac4aaae83b0d68e0be0bab34275cf00545015bcb34203"} Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.580851 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.581921 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" 
event={"ID":"fb80f35c-6683-4296-afd9-a0895e860a3d","Type":"ContainerStarted","Data":"930ab197ae4ea5f405a8edaea3d1a659c750a4b6c3f3ae646ce6b578055027dd"} Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.583468 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" event={"ID":"34885590-0043-44b5-be42-f726d65f8487","Type":"ContainerStarted","Data":"7bb3a1dddbd7c64d854cc08c8543b499821d2a8a94434d998a8f7d43c0b1c495"} Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.583598 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.584977 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" event={"ID":"53bdaeef-1d57-48e5-8b2d-bc9edacd5351","Type":"ContainerStarted","Data":"8e5d4e75a96cdde3206c52f1557f502b300e435befee2fab35ac2c0a1dec3199"} Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.585064 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.586263 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"7e05f45d-0f5c-45a0-81cb-673104c0f806","Type":"ContainerStarted","Data":"fcc4935b4f4c9e74ca5fc0721a67366eafddf1c4209a43739c03b5027fbe3a68"} Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.586479 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.587503 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" 
event={"ID":"53107106-32ab-4c46-949f-094abb62ce68","Type":"ContainerStarted","Data":"e16cc8e3c5943191785fac41f56d0c2b7bb381b68450b60e2eae459f6d8c02db"} Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.599815 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" podStartSLOduration=1.8642638759999999 podStartE2EDuration="6.599795982s" podCreationTimestamp="2025-11-29 14:42:37 +0000 UTC" firstStartedPulling="2025-11-29 14:42:38.155622489 +0000 UTC m=+856.142460141" lastFinishedPulling="2025-11-29 14:42:42.891154595 +0000 UTC m=+860.877992247" observedRunningTime="2025-11-29 14:42:43.596417197 +0000 UTC m=+861.583254879" watchObservedRunningTime="2025-11-29 14:42:43.599795982 +0000 UTC m=+861.586633654" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.628527 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=2.814078862 podStartE2EDuration="6.628506082s" podCreationTimestamp="2025-11-29 14:42:37 +0000 UTC" firstStartedPulling="2025-11-29 14:42:39.037223822 +0000 UTC m=+857.024061464" lastFinishedPulling="2025-11-29 14:42:42.851651032 +0000 UTC m=+860.838488684" observedRunningTime="2025-11-29 14:42:43.623166531 +0000 UTC m=+861.610004213" watchObservedRunningTime="2025-11-29 14:42:43.628506082 +0000 UTC m=+861.615343744" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.655408 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.406994816 podStartE2EDuration="6.655380119s" podCreationTimestamp="2025-11-29 14:42:37 +0000 UTC" firstStartedPulling="2025-11-29 14:42:39.642428313 +0000 UTC m=+857.629265965" lastFinishedPulling="2025-11-29 14:42:42.890813626 +0000 UTC m=+860.877651268" observedRunningTime="2025-11-29 14:42:43.642962859 +0000 UTC m=+861.629800521" 
watchObservedRunningTime="2025-11-29 14:42:43.655380119 +0000 UTC m=+861.642217811" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.662280 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" podStartSLOduration=1.9964341920000002 podStartE2EDuration="6.662262083s" podCreationTimestamp="2025-11-29 14:42:37 +0000 UTC" firstStartedPulling="2025-11-29 14:42:38.224331506 +0000 UTC m=+856.211169158" lastFinishedPulling="2025-11-29 14:42:42.890159397 +0000 UTC m=+860.876997049" observedRunningTime="2025-11-29 14:42:43.660843883 +0000 UTC m=+861.647681535" watchObservedRunningTime="2025-11-29 14:42:43.662262083 +0000 UTC m=+861.649099755" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.685559 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.221366063 podStartE2EDuration="6.685532959s" podCreationTimestamp="2025-11-29 14:42:37 +0000 UTC" firstStartedPulling="2025-11-29 14:42:39.385781998 +0000 UTC m=+857.372619650" lastFinishedPulling="2025-11-29 14:42:42.849948894 +0000 UTC m=+860.836786546" observedRunningTime="2025-11-29 14:42:43.681198847 +0000 UTC m=+861.668036509" watchObservedRunningTime="2025-11-29 14:42:43.685532959 +0000 UTC m=+861.672370631" Nov 29 14:42:43 crc kubenswrapper[4907]: I1129 14:42:43.714013 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" podStartSLOduration=2.288752602 podStartE2EDuration="6.713993831s" podCreationTimestamp="2025-11-29 14:42:37 +0000 UTC" firstStartedPulling="2025-11-29 14:42:38.372682688 +0000 UTC m=+856.359520330" lastFinishedPulling="2025-11-29 14:42:42.797923907 +0000 UTC m=+860.784761559" observedRunningTime="2025-11-29 14:42:43.707853148 +0000 UTC m=+861.694690810" watchObservedRunningTime="2025-11-29 14:42:43.713993831 +0000 UTC 
m=+861.700831493" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.196307 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.197003 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.242703 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.614986 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" event={"ID":"fb80f35c-6683-4296-afd9-a0895e860a3d","Type":"ContainerStarted","Data":"43653bcffcc29b15100d762b9119a590acf4a445e3f964cdc66c9d9c937aa7a4"} Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.615634 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.619299 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" event={"ID":"53107106-32ab-4c46-949f-094abb62ce68","Type":"ContainerStarted","Data":"606817595ebd473003a46835e88938e7c7a7e74413e309eba57d53d6c4f873cb"} Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.620115 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.620313 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.635530 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.639077 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.651642 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.674965 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" podStartSLOduration=2.707842607 podStartE2EDuration="9.674934301s" podCreationTimestamp="2025-11-29 14:42:37 +0000 UTC" firstStartedPulling="2025-11-29 14:42:38.929071423 +0000 UTC m=+856.915909075" lastFinishedPulling="2025-11-29 14:42:45.896163087 +0000 UTC m=+863.883000769" observedRunningTime="2025-11-29 14:42:46.65715595 +0000 UTC m=+864.643993642" watchObservedRunningTime="2025-11-29 14:42:46.674934301 +0000 UTC m=+864.661771993" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.706709 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.734212 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-7865w" podStartSLOduration=2.5915967699999998 podStartE2EDuration="9.734173371s" podCreationTimestamp="2025-11-29 14:42:37 +0000 UTC" firstStartedPulling="2025-11-29 14:42:38.782851501 +0000 UTC m=+856.769689153" lastFinishedPulling="2025-11-29 14:42:45.925428092 +0000 UTC m=+863.912265754" observedRunningTime="2025-11-29 14:42:46.730138857 +0000 UTC m=+864.716976569" watchObservedRunningTime="2025-11-29 14:42:46.734173371 +0000 UTC m=+864.721011053" Nov 29 14:42:46 crc kubenswrapper[4907]: I1129 14:42:46.799415 
4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9t4r"] Nov 29 14:42:47 crc kubenswrapper[4907]: I1129 14:42:47.630488 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:47 crc kubenswrapper[4907]: I1129 14:42:47.639530 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-586bf9b9f5-wlw8p" Nov 29 14:42:48 crc kubenswrapper[4907]: I1129 14:42:48.637563 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f9t4r" podUID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerName="registry-server" containerID="cri-o://d8cd7d76640273038a796010210314b01ccc35eabe5ea8eb5635468fc697640f" gracePeriod=2 Nov 29 14:42:50 crc kubenswrapper[4907]: I1129 14:42:50.660705 4907 generic.go:334] "Generic (PLEG): container finished" podID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerID="d8cd7d76640273038a796010210314b01ccc35eabe5ea8eb5635468fc697640f" exitCode=0 Nov 29 14:42:50 crc kubenswrapper[4907]: I1129 14:42:50.660781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9t4r" event={"ID":"55d74aa1-9372-4d35-be6f-e26c62642bb8","Type":"ContainerDied","Data":"d8cd7d76640273038a796010210314b01ccc35eabe5ea8eb5635468fc697640f"} Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.600012 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.673796 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9t4r" event={"ID":"55d74aa1-9372-4d35-be6f-e26c62642bb8","Type":"ContainerDied","Data":"aa3e8badd69f8b880f600aa52dbca0abf9696a35f83d0820cd859574aefdd40c"} Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.673885 4907 scope.go:117] "RemoveContainer" containerID="d8cd7d76640273038a796010210314b01ccc35eabe5ea8eb5635468fc697640f" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.673918 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9t4r" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.678897 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-utilities\") pod \"55d74aa1-9372-4d35-be6f-e26c62642bb8\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.679136 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-catalog-content\") pod \"55d74aa1-9372-4d35-be6f-e26c62642bb8\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.679183 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6vcj\" (UniqueName: \"kubernetes.io/projected/55d74aa1-9372-4d35-be6f-e26c62642bb8-kube-api-access-g6vcj\") pod \"55d74aa1-9372-4d35-be6f-e26c62642bb8\" (UID: \"55d74aa1-9372-4d35-be6f-e26c62642bb8\") " Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.684023 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-utilities" (OuterVolumeSpecName: "utilities") pod "55d74aa1-9372-4d35-be6f-e26c62642bb8" (UID: "55d74aa1-9372-4d35-be6f-e26c62642bb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.689982 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d74aa1-9372-4d35-be6f-e26c62642bb8-kube-api-access-g6vcj" (OuterVolumeSpecName: "kube-api-access-g6vcj") pod "55d74aa1-9372-4d35-be6f-e26c62642bb8" (UID: "55d74aa1-9372-4d35-be6f-e26c62642bb8"). InnerVolumeSpecName "kube-api-access-g6vcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.700989 4907 scope.go:117] "RemoveContainer" containerID="443705100ec156ce5ea21d0835922649124fe2066372e7853bbbbf80f11c4a0f" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.709665 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55d74aa1-9372-4d35-be6f-e26c62642bb8" (UID: "55d74aa1-9372-4d35-be6f-e26c62642bb8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.732666 4907 scope.go:117] "RemoveContainer" containerID="91eb8a3e8bf24d6d17b9db5c904c77f83a0403db94c53b06ebddc4aea9bf365b" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.781935 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.781976 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6vcj\" (UniqueName: \"kubernetes.io/projected/55d74aa1-9372-4d35-be6f-e26c62642bb8-kube-api-access-g6vcj\") on node \"crc\" DevicePath \"\"" Nov 29 14:42:51 crc kubenswrapper[4907]: I1129 14:42:51.781989 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d74aa1-9372-4d35-be6f-e26c62642bb8-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:42:52 crc kubenswrapper[4907]: I1129 14:42:52.024949 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9t4r"] Nov 29 14:42:52 crc kubenswrapper[4907]: I1129 14:42:52.033738 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9t4r"] Nov 29 14:42:52 crc kubenswrapper[4907]: I1129 14:42:52.491630 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d74aa1-9372-4d35-be6f-e26c62642bb8" path="/var/lib/kubelet/pods/55d74aa1-9372-4d35-be6f-e26c62642bb8/volumes" Nov 29 14:42:58 crc kubenswrapper[4907]: I1129 14:42:58.815132 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Nov 29 14:42:59 crc kubenswrapper[4907]: I1129 14:42:59.071020 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-logging/logging-loki-index-gateway-0" Nov 29 14:42:59 crc kubenswrapper[4907]: I1129 14:42:59.077141 4907 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Nov 29 14:42:59 crc kubenswrapper[4907]: I1129 14:42:59.077203 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="7e05f45d-0f5c-45a0-81cb-673104c0f806" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 29 14:43:07 crc kubenswrapper[4907]: I1129 14:43:07.596054 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-fxc7c" Nov 29 14:43:07 crc kubenswrapper[4907]: I1129 14:43:07.703906 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-dmfpd" Nov 29 14:43:07 crc kubenswrapper[4907]: I1129 14:43:07.926280 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xpnnl" Nov 29 14:43:09 crc kubenswrapper[4907]: I1129 14:43:09.074284 4907 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Nov 29 14:43:09 crc kubenswrapper[4907]: I1129 14:43:09.074852 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="7e05f45d-0f5c-45a0-81cb-673104c0f806" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.706836 
4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rclt"] Nov 29 14:43:10 crc kubenswrapper[4907]: E1129 14:43:10.707582 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerName="extract-utilities" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.707601 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerName="extract-utilities" Nov 29 14:43:10 crc kubenswrapper[4907]: E1129 14:43:10.707645 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerName="registry-server" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.707654 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerName="registry-server" Nov 29 14:43:10 crc kubenswrapper[4907]: E1129 14:43:10.707671 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerName="extract-content" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.707680 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerName="extract-content" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.708075 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d74aa1-9372-4d35-be6f-e26c62642bb8" containerName="registry-server" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.712872 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.745373 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rclt"] Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.746065 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29rb9\" (UniqueName: \"kubernetes.io/projected/eb9761d8-7ba0-4325-8554-10dc66fdc965-kube-api-access-29rb9\") pod \"certified-operators-8rclt\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.746252 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-catalog-content\") pod \"certified-operators-8rclt\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.746352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-utilities\") pod \"certified-operators-8rclt\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.847664 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29rb9\" (UniqueName: \"kubernetes.io/projected/eb9761d8-7ba0-4325-8554-10dc66fdc965-kube-api-access-29rb9\") pod \"certified-operators-8rclt\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.847951 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-catalog-content\") pod \"certified-operators-8rclt\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.848076 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-utilities\") pod \"certified-operators-8rclt\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.848663 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-catalog-content\") pod \"certified-operators-8rclt\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.848791 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-utilities\") pod \"certified-operators-8rclt\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:10 crc kubenswrapper[4907]: I1129 14:43:10.887722 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29rb9\" (UniqueName: \"kubernetes.io/projected/eb9761d8-7ba0-4325-8554-10dc66fdc965-kube-api-access-29rb9\") pod \"certified-operators-8rclt\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:11 crc kubenswrapper[4907]: I1129 14:43:11.063105 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:11 crc kubenswrapper[4907]: I1129 14:43:11.370318 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rclt"] Nov 29 14:43:11 crc kubenswrapper[4907]: I1129 14:43:11.840862 4907 generic.go:334] "Generic (PLEG): container finished" podID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerID="4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229" exitCode=0 Nov 29 14:43:11 crc kubenswrapper[4907]: I1129 14:43:11.841155 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rclt" event={"ID":"eb9761d8-7ba0-4325-8554-10dc66fdc965","Type":"ContainerDied","Data":"4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229"} Nov 29 14:43:11 crc kubenswrapper[4907]: I1129 14:43:11.841189 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rclt" event={"ID":"eb9761d8-7ba0-4325-8554-10dc66fdc965","Type":"ContainerStarted","Data":"a2b278cf60fb732efa7e30c7aa75e8708f0a3167b5247b2f3133917beede47a6"} Nov 29 14:43:12 crc kubenswrapper[4907]: I1129 14:43:12.849217 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rclt" event={"ID":"eb9761d8-7ba0-4325-8554-10dc66fdc965","Type":"ContainerStarted","Data":"17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf"} Nov 29 14:43:13 crc kubenswrapper[4907]: I1129 14:43:13.874862 4907 generic.go:334] "Generic (PLEG): container finished" podID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerID="17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf" exitCode=0 Nov 29 14:43:13 crc kubenswrapper[4907]: I1129 14:43:13.874946 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rclt" 
event={"ID":"eb9761d8-7ba0-4325-8554-10dc66fdc965","Type":"ContainerDied","Data":"17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf"} Nov 29 14:43:14 crc kubenswrapper[4907]: I1129 14:43:14.886959 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rclt" event={"ID":"eb9761d8-7ba0-4325-8554-10dc66fdc965","Type":"ContainerStarted","Data":"b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b"} Nov 29 14:43:14 crc kubenswrapper[4907]: I1129 14:43:14.915351 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8rclt" podStartSLOduration=2.464459834 podStartE2EDuration="4.915328655s" podCreationTimestamp="2025-11-29 14:43:10 +0000 UTC" firstStartedPulling="2025-11-29 14:43:11.844037104 +0000 UTC m=+889.830874746" lastFinishedPulling="2025-11-29 14:43:14.294905875 +0000 UTC m=+892.281743567" observedRunningTime="2025-11-29 14:43:14.908931884 +0000 UTC m=+892.895769546" watchObservedRunningTime="2025-11-29 14:43:14.915328655 +0000 UTC m=+892.902166337" Nov 29 14:43:19 crc kubenswrapper[4907]: I1129 14:43:19.072913 4907 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Nov 29 14:43:19 crc kubenswrapper[4907]: I1129 14:43:19.073369 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="7e05f45d-0f5c-45a0-81cb-673104c0f806" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 29 14:43:21 crc kubenswrapper[4907]: I1129 14:43:21.063775 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:21 crc kubenswrapper[4907]: I1129 14:43:21.065310 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:21 crc kubenswrapper[4907]: I1129 14:43:21.128953 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:22 crc kubenswrapper[4907]: I1129 14:43:22.019888 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:22 crc kubenswrapper[4907]: I1129 14:43:22.082520 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rclt"] Nov 29 14:43:23 crc kubenswrapper[4907]: I1129 14:43:23.967698 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rclt" podUID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerName="registry-server" containerID="cri-o://b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b" gracePeriod=2 Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.436594 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.537917 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-catalog-content\") pod \"eb9761d8-7ba0-4325-8554-10dc66fdc965\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.538030 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-utilities\") pod \"eb9761d8-7ba0-4325-8554-10dc66fdc965\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.538126 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29rb9\" (UniqueName: \"kubernetes.io/projected/eb9761d8-7ba0-4325-8554-10dc66fdc965-kube-api-access-29rb9\") pod \"eb9761d8-7ba0-4325-8554-10dc66fdc965\" (UID: \"eb9761d8-7ba0-4325-8554-10dc66fdc965\") " Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.538936 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-utilities" (OuterVolumeSpecName: "utilities") pod "eb9761d8-7ba0-4325-8554-10dc66fdc965" (UID: "eb9761d8-7ba0-4325-8554-10dc66fdc965"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.539240 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.548761 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb9761d8-7ba0-4325-8554-10dc66fdc965-kube-api-access-29rb9" (OuterVolumeSpecName: "kube-api-access-29rb9") pod "eb9761d8-7ba0-4325-8554-10dc66fdc965" (UID: "eb9761d8-7ba0-4325-8554-10dc66fdc965"). InnerVolumeSpecName "kube-api-access-29rb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.602899 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb9761d8-7ba0-4325-8554-10dc66fdc965" (UID: "eb9761d8-7ba0-4325-8554-10dc66fdc965"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.640848 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9761d8-7ba0-4325-8554-10dc66fdc965-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.640880 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29rb9\" (UniqueName: \"kubernetes.io/projected/eb9761d8-7ba0-4325-8554-10dc66fdc965-kube-api-access-29rb9\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.978210 4907 generic.go:334] "Generic (PLEG): container finished" podID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerID="b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b" exitCode=0 Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.978269 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rclt" event={"ID":"eb9761d8-7ba0-4325-8554-10dc66fdc965","Type":"ContainerDied","Data":"b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b"} Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.978286 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rclt" Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.978308 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rclt" event={"ID":"eb9761d8-7ba0-4325-8554-10dc66fdc965","Type":"ContainerDied","Data":"a2b278cf60fb732efa7e30c7aa75e8708f0a3167b5247b2f3133917beede47a6"} Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.978335 4907 scope.go:117] "RemoveContainer" containerID="b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b" Nov 29 14:43:24 crc kubenswrapper[4907]: I1129 14:43:24.997021 4907 scope.go:117] "RemoveContainer" containerID="17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf" Nov 29 14:43:25 crc kubenswrapper[4907]: I1129 14:43:25.023054 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rclt"] Nov 29 14:43:25 crc kubenswrapper[4907]: I1129 14:43:25.030757 4907 scope.go:117] "RemoveContainer" containerID="4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229" Nov 29 14:43:25 crc kubenswrapper[4907]: I1129 14:43:25.030771 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rclt"] Nov 29 14:43:25 crc kubenswrapper[4907]: I1129 14:43:25.091850 4907 scope.go:117] "RemoveContainer" containerID="b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b" Nov 29 14:43:25 crc kubenswrapper[4907]: E1129 14:43:25.092324 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b\": container with ID starting with b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b not found: ID does not exist" containerID="b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b" Nov 29 14:43:25 crc kubenswrapper[4907]: I1129 14:43:25.092371 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b"} err="failed to get container status \"b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b\": rpc error: code = NotFound desc = could not find container \"b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b\": container with ID starting with b6f548140bfaf83ebc73b8ef16ac0fbbfd205a36e5fec7d9435191a2e0381a1b not found: ID does not exist" Nov 29 14:43:25 crc kubenswrapper[4907]: I1129 14:43:25.092404 4907 scope.go:117] "RemoveContainer" containerID="17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf" Nov 29 14:43:25 crc kubenswrapper[4907]: E1129 14:43:25.092877 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf\": container with ID starting with 17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf not found: ID does not exist" containerID="17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf" Nov 29 14:43:25 crc kubenswrapper[4907]: I1129 14:43:25.092939 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf"} err="failed to get container status \"17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf\": rpc error: code = NotFound desc = could not find container \"17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf\": container with ID starting with 17e6ed7bbe204dc1d4ff16bc4da4aa7bf47ff0f0ca8c178c704802ed4f74babf not found: ID does not exist" Nov 29 14:43:25 crc kubenswrapper[4907]: I1129 14:43:25.092972 4907 scope.go:117] "RemoveContainer" containerID="4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229" Nov 29 14:43:25 crc kubenswrapper[4907]: E1129 
14:43:25.093358 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229\": container with ID starting with 4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229 not found: ID does not exist" containerID="4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229" Nov 29 14:43:25 crc kubenswrapper[4907]: I1129 14:43:25.093392 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229"} err="failed to get container status \"4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229\": rpc error: code = NotFound desc = could not find container \"4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229\": container with ID starting with 4b2b70f3a2e98ddd41ea6e9c52021b0f9829c4c30348ce8315b01fe67f01e229 not found: ID does not exist" Nov 29 14:43:26 crc kubenswrapper[4907]: I1129 14:43:26.496047 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9761d8-7ba0-4325-8554-10dc66fdc965" path="/var/lib/kubelet/pods/eb9761d8-7ba0-4325-8554-10dc66fdc965/volumes" Nov 29 14:43:29 crc kubenswrapper[4907]: I1129 14:43:29.070186 4907 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Nov 29 14:43:29 crc kubenswrapper[4907]: I1129 14:43:29.070667 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="7e05f45d-0f5c-45a0-81cb-673104c0f806" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 29 14:43:33 crc kubenswrapper[4907]: I1129 14:43:33.986206 4907 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/community-operators-24bfr"] Nov 29 14:43:33 crc kubenswrapper[4907]: E1129 14:43:33.987366 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerName="extract-utilities" Nov 29 14:43:33 crc kubenswrapper[4907]: I1129 14:43:33.987394 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerName="extract-utilities" Nov 29 14:43:33 crc kubenswrapper[4907]: E1129 14:43:33.987501 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerName="registry-server" Nov 29 14:43:33 crc kubenswrapper[4907]: I1129 14:43:33.987521 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerName="registry-server" Nov 29 14:43:33 crc kubenswrapper[4907]: E1129 14:43:33.987559 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerName="extract-content" Nov 29 14:43:33 crc kubenswrapper[4907]: I1129 14:43:33.987576 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerName="extract-content" Nov 29 14:43:33 crc kubenswrapper[4907]: I1129 14:43:33.987801 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9761d8-7ba0-4325-8554-10dc66fdc965" containerName="registry-server" Nov 29 14:43:33 crc kubenswrapper[4907]: I1129 14:43:33.989690 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.018728 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncz8j\" (UniqueName: \"kubernetes.io/projected/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-kube-api-access-ncz8j\") pod \"community-operators-24bfr\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.018878 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-utilities\") pod \"community-operators-24bfr\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.018936 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-catalog-content\") pod \"community-operators-24bfr\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.025685 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24bfr"] Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.120318 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-catalog-content\") pod \"community-operators-24bfr\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.120488 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ncz8j\" (UniqueName: \"kubernetes.io/projected/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-kube-api-access-ncz8j\") pod \"community-operators-24bfr\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.120548 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-utilities\") pod \"community-operators-24bfr\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.120792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-catalog-content\") pod \"community-operators-24bfr\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.121191 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-utilities\") pod \"community-operators-24bfr\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.142884 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncz8j\" (UniqueName: \"kubernetes.io/projected/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-kube-api-access-ncz8j\") pod \"community-operators-24bfr\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.316216 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:34 crc kubenswrapper[4907]: I1129 14:43:34.796142 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24bfr"] Nov 29 14:43:35 crc kubenswrapper[4907]: I1129 14:43:35.082271 4907 generic.go:334] "Generic (PLEG): container finished" podID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerID="d140931f14bbb63bad2ed5a970ce2a06600e03438ee51b14659dcce651cc865b" exitCode=0 Nov 29 14:43:35 crc kubenswrapper[4907]: I1129 14:43:35.082497 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24bfr" event={"ID":"0b969ed7-9c77-4286-94c0-f7a0e7f3585e","Type":"ContainerDied","Data":"d140931f14bbb63bad2ed5a970ce2a06600e03438ee51b14659dcce651cc865b"} Nov 29 14:43:35 crc kubenswrapper[4907]: I1129 14:43:35.082619 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24bfr" event={"ID":"0b969ed7-9c77-4286-94c0-f7a0e7f3585e","Type":"ContainerStarted","Data":"e2b4457a6f9ad14f7021f9f18715a5c19df7790a64a2fc6f3dbd6850b356b0b0"} Nov 29 14:43:36 crc kubenswrapper[4907]: I1129 14:43:36.090582 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24bfr" event={"ID":"0b969ed7-9c77-4286-94c0-f7a0e7f3585e","Type":"ContainerStarted","Data":"3ce09f653308da0fb0ce5e6d6b55698a603e93af1a4bfdcf8244340009d58c02"} Nov 29 14:43:37 crc kubenswrapper[4907]: I1129 14:43:37.104262 4907 generic.go:334] "Generic (PLEG): container finished" podID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerID="3ce09f653308da0fb0ce5e6d6b55698a603e93af1a4bfdcf8244340009d58c02" exitCode=0 Nov 29 14:43:37 crc kubenswrapper[4907]: I1129 14:43:37.104352 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24bfr" 
event={"ID":"0b969ed7-9c77-4286-94c0-f7a0e7f3585e","Type":"ContainerDied","Data":"3ce09f653308da0fb0ce5e6d6b55698a603e93af1a4bfdcf8244340009d58c02"} Nov 29 14:43:38 crc kubenswrapper[4907]: I1129 14:43:38.115354 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24bfr" event={"ID":"0b969ed7-9c77-4286-94c0-f7a0e7f3585e","Type":"ContainerStarted","Data":"584b31c3a7973ec13582852b03599c0123a7c146d6d92abae74959c2b482c062"} Nov 29 14:43:38 crc kubenswrapper[4907]: I1129 14:43:38.140105 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-24bfr" podStartSLOduration=2.591839551 podStartE2EDuration="5.140084157s" podCreationTimestamp="2025-11-29 14:43:33 +0000 UTC" firstStartedPulling="2025-11-29 14:43:35.085122997 +0000 UTC m=+913.071960659" lastFinishedPulling="2025-11-29 14:43:37.633367583 +0000 UTC m=+915.620205265" observedRunningTime="2025-11-29 14:43:38.137022891 +0000 UTC m=+916.123860553" watchObservedRunningTime="2025-11-29 14:43:38.140084157 +0000 UTC m=+916.126921819" Nov 29 14:43:39 crc kubenswrapper[4907]: I1129 14:43:39.079864 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Nov 29 14:43:44 crc kubenswrapper[4907]: I1129 14:43:44.317308 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:44 crc kubenswrapper[4907]: I1129 14:43:44.318054 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:44 crc kubenswrapper[4907]: I1129 14:43:44.391351 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:45 crc kubenswrapper[4907]: I1129 14:43:45.254004 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:45 crc kubenswrapper[4907]: I1129 14:43:45.324615 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24bfr"] Nov 29 14:43:47 crc kubenswrapper[4907]: I1129 14:43:47.191200 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-24bfr" podUID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerName="registry-server" containerID="cri-o://584b31c3a7973ec13582852b03599c0123a7c146d6d92abae74959c2b482c062" gracePeriod=2 Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.205971 4907 generic.go:334] "Generic (PLEG): container finished" podID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerID="584b31c3a7973ec13582852b03599c0123a7c146d6d92abae74959c2b482c062" exitCode=0 Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.206035 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24bfr" event={"ID":"0b969ed7-9c77-4286-94c0-f7a0e7f3585e","Type":"ContainerDied","Data":"584b31c3a7973ec13582852b03599c0123a7c146d6d92abae74959c2b482c062"} Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.387625 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.493381 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-utilities\") pod \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.493461 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-catalog-content\") pod \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.493526 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncz8j\" (UniqueName: \"kubernetes.io/projected/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-kube-api-access-ncz8j\") pod \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\" (UID: \"0b969ed7-9c77-4286-94c0-f7a0e7f3585e\") " Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.494345 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-utilities" (OuterVolumeSpecName: "utilities") pod "0b969ed7-9c77-4286-94c0-f7a0e7f3585e" (UID: "0b969ed7-9c77-4286-94c0-f7a0e7f3585e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.501653 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-kube-api-access-ncz8j" (OuterVolumeSpecName: "kube-api-access-ncz8j") pod "0b969ed7-9c77-4286-94c0-f7a0e7f3585e" (UID: "0b969ed7-9c77-4286-94c0-f7a0e7f3585e"). InnerVolumeSpecName "kube-api-access-ncz8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.552324 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b969ed7-9c77-4286-94c0-f7a0e7f3585e" (UID: "0b969ed7-9c77-4286-94c0-f7a0e7f3585e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.595531 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncz8j\" (UniqueName: \"kubernetes.io/projected/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-kube-api-access-ncz8j\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.595586 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:49 crc kubenswrapper[4907]: I1129 14:43:49.595606 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b969ed7-9c77-4286-94c0-f7a0e7f3585e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:50 crc kubenswrapper[4907]: I1129 14:43:50.222508 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24bfr" event={"ID":"0b969ed7-9c77-4286-94c0-f7a0e7f3585e","Type":"ContainerDied","Data":"e2b4457a6f9ad14f7021f9f18715a5c19df7790a64a2fc6f3dbd6850b356b0b0"} Nov 29 14:43:50 crc kubenswrapper[4907]: I1129 14:43:50.222586 4907 scope.go:117] "RemoveContainer" containerID="584b31c3a7973ec13582852b03599c0123a7c146d6d92abae74959c2b482c062" Nov 29 14:43:50 crc kubenswrapper[4907]: I1129 14:43:50.222684 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24bfr" Nov 29 14:43:50 crc kubenswrapper[4907]: I1129 14:43:50.275704 4907 scope.go:117] "RemoveContainer" containerID="3ce09f653308da0fb0ce5e6d6b55698a603e93af1a4bfdcf8244340009d58c02" Nov 29 14:43:50 crc kubenswrapper[4907]: I1129 14:43:50.303287 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24bfr"] Nov 29 14:43:50 crc kubenswrapper[4907]: I1129 14:43:50.313176 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-24bfr"] Nov 29 14:43:50 crc kubenswrapper[4907]: I1129 14:43:50.335164 4907 scope.go:117] "RemoveContainer" containerID="d140931f14bbb63bad2ed5a970ce2a06600e03438ee51b14659dcce651cc865b" Nov 29 14:43:50 crc kubenswrapper[4907]: I1129 14:43:50.491190 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" path="/var/lib/kubelet/pods/0b969ed7-9c77-4286-94c0-f7a0e7f3585e/volumes" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.490097 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.490753 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.673702 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-nf28s"] Nov 29 14:43:58 crc kubenswrapper[4907]: E1129 14:43:58.674034 
4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerName="extract-utilities" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.674053 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerName="extract-utilities" Nov 29 14:43:58 crc kubenswrapper[4907]: E1129 14:43:58.674091 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerName="extract-content" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.674100 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerName="extract-content" Nov 29 14:43:58 crc kubenswrapper[4907]: E1129 14:43:58.674108 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerName="registry-server" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.674118 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerName="registry-server" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.674275 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b969ed7-9c77-4286-94c0-f7a0e7f3585e" containerName="registry-server" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.674916 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.681904 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.681924 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.681954 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-vvjvx" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.682054 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.682290 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.704485 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.707192 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-nf28s"] Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.766984 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config-openshift-service-cacrt\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.767058 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config\") pod \"collector-nf28s\" (UID: 
\"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.767136 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-sa-token\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.767286 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-metrics\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.767380 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/e49179a8-88b4-497d-8aef-0f855df6e236-datadir\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.767417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49179a8-88b4-497d-8aef-0f855df6e236-tmp\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.767499 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-token\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc 
kubenswrapper[4907]: I1129 14:43:58.767557 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2gn\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-kube-api-access-bc2gn\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.767711 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-syslog-receiver\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.767786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-entrypoint\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.767936 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-trusted-ca\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.826364 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-nf28s"] Nov 29 14:43:58 crc kubenswrapper[4907]: E1129 14:43:58.827009 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-bc2gn metrics sa-token 
tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-nf28s" podUID="e49179a8-88b4-497d-8aef-0f855df6e236" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.868885 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-syslog-receiver\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.868937 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-entrypoint\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.868975 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-trusted-ca\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869007 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config-openshift-service-cacrt\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config\") pod \"collector-nf28s\" (UID: 
\"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869047 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-sa-token\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869067 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-metrics\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869088 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/e49179a8-88b4-497d-8aef-0f855df6e236-datadir\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869107 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49179a8-88b4-497d-8aef-0f855df6e236-tmp\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869128 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-token\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869147 4907 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-bc2gn\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-kube-api-access-bc2gn\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869225 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/e49179a8-88b4-497d-8aef-0f855df6e236-datadir\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: E1129 14:43:58.869324 4907 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Nov 29 14:43:58 crc kubenswrapper[4907]: E1129 14:43:58.869392 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-metrics podName:e49179a8-88b4-497d-8aef-0f855df6e236 nodeName:}" failed. No retries permitted until 2025-11-29 14:43:59.369376351 +0000 UTC m=+937.356214003 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-metrics") pod "collector-nf28s" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236") : secret "collector-metrics" not found Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.869883 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-entrypoint\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.870051 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config-openshift-service-cacrt\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.870095 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: E1129 14:43:58.870183 4907 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Nov 29 14:43:58 crc kubenswrapper[4907]: E1129 14:43:58.870251 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-syslog-receiver podName:e49179a8-88b4-497d-8aef-0f855df6e236 nodeName:}" failed. No retries permitted until 2025-11-29 14:43:59.370230565 +0000 UTC m=+937.357068287 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-syslog-receiver") pod "collector-nf28s" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236") : secret "collector-syslog-receiver" not found Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.870342 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-trusted-ca\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.876250 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49179a8-88b4-497d-8aef-0f855df6e236-tmp\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.887107 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2gn\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-kube-api-access-bc2gn\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.887464 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-token\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:58 crc kubenswrapper[4907]: I1129 14:43:58.887749 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-sa-token\") pod \"collector-nf28s\" (UID: 
\"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.299906 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-nf28s" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.318790 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-nf28s" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.377297 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-metrics\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.377371 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-syslog-receiver\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.383570 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-syslog-receiver\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.385046 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-metrics\") pod \"collector-nf28s\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " pod="openshift-logging/collector-nf28s" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.478663 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49179a8-88b4-497d-8aef-0f855df6e236-tmp\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.478763 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-token\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.478805 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc2gn\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-kube-api-access-bc2gn\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.478885 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-entrypoint\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.478955 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-sa-token\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.479019 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: 
\"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.479059 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/e49179a8-88b4-497d-8aef-0f855df6e236-datadir\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.479191 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-trusted-ca\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.479296 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config-openshift-service-cacrt\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.479339 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-metrics\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.479392 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-syslog-receiver\") pod \"e49179a8-88b4-497d-8aef-0f855df6e236\" (UID: \"e49179a8-88b4-497d-8aef-0f855df6e236\") " Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.479826 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.480038 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e49179a8-88b4-497d-8aef-0f855df6e236-datadir" (OuterVolumeSpecName: "datadir") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.480722 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.480847 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.481327 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config" (OuterVolumeSpecName: "config") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.484943 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-token" (OuterVolumeSpecName: "collector-token") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.485106 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-kube-api-access-bc2gn" (OuterVolumeSpecName: "kube-api-access-bc2gn") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "kube-api-access-bc2gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.485481 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-metrics" (OuterVolumeSpecName: "metrics") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.488630 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "collector-syslog-receiver". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.488691 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49179a8-88b4-497d-8aef-0f855df6e236-tmp" (OuterVolumeSpecName: "tmp") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.488734 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-sa-token" (OuterVolumeSpecName: "sa-token") pod "e49179a8-88b4-497d-8aef-0f855df6e236" (UID: "e49179a8-88b4-497d-8aef-0f855df6e236"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.581956 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.582025 4907 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.582050 4907 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-metrics\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.582073 4907 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 
crc kubenswrapper[4907]: I1129 14:43:59.582094 4907 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49179a8-88b4-497d-8aef-0f855df6e236-tmp\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.582116 4907 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/e49179a8-88b4-497d-8aef-0f855df6e236-collector-token\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.582138 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc2gn\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-kube-api-access-bc2gn\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.582161 4907 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-entrypoint\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.582184 4907 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/e49179a8-88b4-497d-8aef-0f855df6e236-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.582208 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49179a8-88b4-497d-8aef-0f855df6e236-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:43:59 crc kubenswrapper[4907]: I1129 14:43:59.582231 4907 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/e49179a8-88b4-497d-8aef-0f855df6e236-datadir\") on node \"crc\" DevicePath \"\"" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.310006 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-nf28s" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.386761 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-nf28s"] Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.402079 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-nf28s"] Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.410039 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-45v85"] Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.413154 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.417686 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.418400 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.419168 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.419479 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-vvjvx" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.419657 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.432734 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.436325 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-45v85"] Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.488807 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49179a8-88b4-497d-8aef-0f855df6e236" path="/var/lib/kubelet/pods/e49179a8-88b4-497d-8aef-0f855df6e236/volumes" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501013 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-collector-token\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501094 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-trusted-ca\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501150 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-collector-syslog-receiver\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501168 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-sa-token\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501191 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: 
\"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-config-openshift-service-cacrt\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501240 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnx58\" (UniqueName: \"kubernetes.io/projected/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-kube-api-access-vnx58\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501331 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-tmp\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501369 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-metrics\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501390 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-datadir\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501475 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-entrypoint\") 
pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.501532 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-config\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.603642 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-tmp\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.603703 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-metrics\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.603740 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-datadir\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.603803 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-entrypoint\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.603860 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-config\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.603904 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-datadir\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.603967 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-collector-token\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.604049 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-trusted-ca\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.604086 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-collector-syslog-receiver\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.604120 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: 
\"kubernetes.io/projected/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-sa-token\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.604168 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-config-openshift-service-cacrt\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.604209 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnx58\" (UniqueName: \"kubernetes.io/projected/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-kube-api-access-vnx58\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.605089 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-config\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.606170 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-config-openshift-service-cacrt\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.606404 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-entrypoint\") pod \"collector-45v85\" (UID: 
\"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.606562 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-trusted-ca\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.610169 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-collector-syslog-receiver\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.610498 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-metrics\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.611951 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-tmp\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.613094 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-collector-token\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.627066 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-sa-token\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.630544 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnx58\" (UniqueName: \"kubernetes.io/projected/6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a-kube-api-access-vnx58\") pod \"collector-45v85\" (UID: \"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a\") " pod="openshift-logging/collector-45v85" Nov 29 14:44:00 crc kubenswrapper[4907]: I1129 14:44:00.739729 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-45v85" Nov 29 14:44:01 crc kubenswrapper[4907]: I1129 14:44:01.067842 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-45v85"] Nov 29 14:44:01 crc kubenswrapper[4907]: I1129 14:44:01.321975 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-45v85" event={"ID":"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a","Type":"ContainerStarted","Data":"f8ac67faa1c149acb355e4325d3e273f9382d3b9329317309a389bc41c199056"} Nov 29 14:44:10 crc kubenswrapper[4907]: I1129 14:44:10.406849 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-45v85" event={"ID":"6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a","Type":"ContainerStarted","Data":"b502e2d3ba15685ebf9d6c8e2009ec42b05ccad3cedf715641837e114f316540"} Nov 29 14:44:10 crc kubenswrapper[4907]: I1129 14:44:10.445907 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-45v85" podStartSLOduration=2.078349212 podStartE2EDuration="10.445824524s" podCreationTimestamp="2025-11-29 14:44:00 +0000 UTC" firstStartedPulling="2025-11-29 14:44:01.08894578 +0000 UTC m=+939.075783462" lastFinishedPulling="2025-11-29 14:44:09.456421112 +0000 UTC 
m=+947.443258774" observedRunningTime="2025-11-29 14:44:10.436937453 +0000 UTC m=+948.423775145" watchObservedRunningTime="2025-11-29 14:44:10.445824524 +0000 UTC m=+948.432662226" Nov 29 14:44:28 crc kubenswrapper[4907]: I1129 14:44:28.490894 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:44:28 crc kubenswrapper[4907]: I1129 14:44:28.491623 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.140183 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb"] Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.142167 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.144180 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.155177 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb"] Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.333173 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.333281 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ncf\" (UniqueName: \"kubernetes.io/projected/1e95c092-9faa-432a-8f5b-b4a831e12946-kube-api-access-42ncf\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.333327 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: 
I1129 14:44:38.435157 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.435251 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42ncf\" (UniqueName: \"kubernetes.io/projected/1e95c092-9faa-432a-8f5b-b4a831e12946-kube-api-access-42ncf\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.435298 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.435815 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.435852 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.456898 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ncf\" (UniqueName: \"kubernetes.io/projected/1e95c092-9faa-432a-8f5b-b4a831e12946-kube-api-access-42ncf\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.469584 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:38 crc kubenswrapper[4907]: I1129 14:44:38.890355 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb"] Nov 29 14:44:39 crc kubenswrapper[4907]: I1129 14:44:39.659555 4907 generic.go:334] "Generic (PLEG): container finished" podID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerID="e99217189e097964b2caabeffc2334d248ba2fb596f110c3308138004c88a1ef" exitCode=0 Nov 29 14:44:39 crc kubenswrapper[4907]: I1129 14:44:39.659653 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" event={"ID":"1e95c092-9faa-432a-8f5b-b4a831e12946","Type":"ContainerDied","Data":"e99217189e097964b2caabeffc2334d248ba2fb596f110c3308138004c88a1ef"} Nov 29 14:44:39 crc kubenswrapper[4907]: I1129 14:44:39.660087 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" event={"ID":"1e95c092-9faa-432a-8f5b-b4a831e12946","Type":"ContainerStarted","Data":"1480be13abed944662eaebe7bd8caf3e3363eb7049ef5e19ebec5c0a199b0e98"} Nov 29 14:44:45 crc kubenswrapper[4907]: I1129 14:44:45.722123 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" event={"ID":"1e95c092-9faa-432a-8f5b-b4a831e12946","Type":"ContainerStarted","Data":"cdff4493115982e5822e701865c3f99c7db54e16888fa3e93ec69899251967b1"} Nov 29 14:44:46 crc kubenswrapper[4907]: I1129 14:44:46.733987 4907 generic.go:334] "Generic (PLEG): container finished" podID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerID="cdff4493115982e5822e701865c3f99c7db54e16888fa3e93ec69899251967b1" exitCode=0 Nov 29 14:44:46 crc kubenswrapper[4907]: I1129 14:44:46.734092 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" event={"ID":"1e95c092-9faa-432a-8f5b-b4a831e12946","Type":"ContainerDied","Data":"cdff4493115982e5822e701865c3f99c7db54e16888fa3e93ec69899251967b1"} Nov 29 14:44:47 crc kubenswrapper[4907]: I1129 14:44:47.747271 4907 generic.go:334] "Generic (PLEG): container finished" podID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerID="58411141e3f8b2f91361b62c4026cdbf1173d4bb0cb6eb0d6fc90ce7fac2cab9" exitCode=0 Nov 29 14:44:47 crc kubenswrapper[4907]: I1129 14:44:47.747412 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" event={"ID":"1e95c092-9faa-432a-8f5b-b4a831e12946","Type":"ContainerDied","Data":"58411141e3f8b2f91361b62c4026cdbf1173d4bb0cb6eb0d6fc90ce7fac2cab9"} Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.163236 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.268534 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-bundle\") pod \"1e95c092-9faa-432a-8f5b-b4a831e12946\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.268639 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42ncf\" (UniqueName: \"kubernetes.io/projected/1e95c092-9faa-432a-8f5b-b4a831e12946-kube-api-access-42ncf\") pod \"1e95c092-9faa-432a-8f5b-b4a831e12946\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.268736 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-util\") pod \"1e95c092-9faa-432a-8f5b-b4a831e12946\" (UID: \"1e95c092-9faa-432a-8f5b-b4a831e12946\") " Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.269084 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-bundle" (OuterVolumeSpecName: "bundle") pod "1e95c092-9faa-432a-8f5b-b4a831e12946" (UID: "1e95c092-9faa-432a-8f5b-b4a831e12946"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.269358 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.277827 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e95c092-9faa-432a-8f5b-b4a831e12946-kube-api-access-42ncf" (OuterVolumeSpecName: "kube-api-access-42ncf") pod "1e95c092-9faa-432a-8f5b-b4a831e12946" (UID: "1e95c092-9faa-432a-8f5b-b4a831e12946"). InnerVolumeSpecName "kube-api-access-42ncf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.278877 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-util" (OuterVolumeSpecName: "util") pod "1e95c092-9faa-432a-8f5b-b4a831e12946" (UID: "1e95c092-9faa-432a-8f5b-b4a831e12946"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.371468 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42ncf\" (UniqueName: \"kubernetes.io/projected/1e95c092-9faa-432a-8f5b-b4a831e12946-kube-api-access-42ncf\") on node \"crc\" DevicePath \"\"" Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.371527 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e95c092-9faa-432a-8f5b-b4a831e12946-util\") on node \"crc\" DevicePath \"\"" Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.764960 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" event={"ID":"1e95c092-9faa-432a-8f5b-b4a831e12946","Type":"ContainerDied","Data":"1480be13abed944662eaebe7bd8caf3e3363eb7049ef5e19ebec5c0a199b0e98"} Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.765019 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1480be13abed944662eaebe7bd8caf3e3363eb7049ef5e19ebec5c0a199b0e98" Nov 29 14:44:49 crc kubenswrapper[4907]: I1129 14:44:49.765046 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.900469 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl"] Nov 29 14:44:54 crc kubenswrapper[4907]: E1129 14:44:54.901371 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerName="pull" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.901387 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerName="pull" Nov 29 14:44:54 crc kubenswrapper[4907]: E1129 14:44:54.901408 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerName="extract" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.901416 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerName="extract" Nov 29 14:44:54 crc kubenswrapper[4907]: E1129 14:44:54.901429 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerName="util" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.901455 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerName="util" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.901619 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e95c092-9faa-432a-8f5b-b4a831e12946" containerName="extract" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.902250 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.903920 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.908492 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.910892 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl"] Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.911964 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-t2kn2" Nov 29 14:44:54 crc kubenswrapper[4907]: I1129 14:44:54.967213 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q46r\" (UniqueName: \"kubernetes.io/projected/19da125c-061c-4051-853f-38e13d9a6d5f-kube-api-access-8q46r\") pod \"nmstate-operator-5b5b58f5c8-p8tcl\" (UID: \"19da125c-061c-4051-853f-38e13d9a6d5f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl" Nov 29 14:44:55 crc kubenswrapper[4907]: I1129 14:44:55.067717 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q46r\" (UniqueName: \"kubernetes.io/projected/19da125c-061c-4051-853f-38e13d9a6d5f-kube-api-access-8q46r\") pod \"nmstate-operator-5b5b58f5c8-p8tcl\" (UID: \"19da125c-061c-4051-853f-38e13d9a6d5f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl" Nov 29 14:44:55 crc kubenswrapper[4907]: I1129 14:44:55.093359 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q46r\" (UniqueName: \"kubernetes.io/projected/19da125c-061c-4051-853f-38e13d9a6d5f-kube-api-access-8q46r\") pod \"nmstate-operator-5b5b58f5c8-p8tcl\" (UID: 
\"19da125c-061c-4051-853f-38e13d9a6d5f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl" Nov 29 14:44:55 crc kubenswrapper[4907]: I1129 14:44:55.225221 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl" Nov 29 14:44:55 crc kubenswrapper[4907]: I1129 14:44:55.752010 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl"] Nov 29 14:44:55 crc kubenswrapper[4907]: I1129 14:44:55.813967 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl" event={"ID":"19da125c-061c-4051-853f-38e13d9a6d5f","Type":"ContainerStarted","Data":"2e2ae591aac91270f1f78eec3fc8dd6aaa58a84aa4206d0d4aa637de53d71dcb"} Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.490529 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.491336 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.491389 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.492242 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"81cf87bbb8090f9964b6b2dbf0b6be6946b5091a7113f8782940ac4da5885e64"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.492335 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://81cf87bbb8090f9964b6b2dbf0b6be6946b5091a7113f8782940ac4da5885e64" gracePeriod=600 Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.857810 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="81cf87bbb8090f9964b6b2dbf0b6be6946b5091a7113f8782940ac4da5885e64" exitCode=0 Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.857898 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"81cf87bbb8090f9964b6b2dbf0b6be6946b5091a7113f8782940ac4da5885e64"} Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.858254 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"b8e5b56ee968d515ff618b3f298ba561ab70814c2dd33e300a89c15ce55549c1"} Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.858296 4907 scope.go:117] "RemoveContainer" containerID="f6d0279b6c0a0b7cac049f6991025c0a86d66be3a78b2d46b37cde84b40abbc6" Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.862419 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl" 
event={"ID":"19da125c-061c-4051-853f-38e13d9a6d5f","Type":"ContainerStarted","Data":"6e53110651d5cd2e64b0167d04e2f17d5e13404d2b9e9e9fd9e31bb0253d9f8c"} Nov 29 14:44:58 crc kubenswrapper[4907]: I1129 14:44:58.916083 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-p8tcl" podStartSLOduration=2.634248178 podStartE2EDuration="4.916065082s" podCreationTimestamp="2025-11-29 14:44:54 +0000 UTC" firstStartedPulling="2025-11-29 14:44:55.766759972 +0000 UTC m=+993.753597625" lastFinishedPulling="2025-11-29 14:44:58.048576877 +0000 UTC m=+996.035414529" observedRunningTime="2025-11-29 14:44:58.908811328 +0000 UTC m=+996.895648990" watchObservedRunningTime="2025-11-29 14:44:58.916065082 +0000 UTC m=+996.902902754" Nov 29 14:44:59 crc kubenswrapper[4907]: I1129 14:44:59.980676 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp"] Nov 29 14:44:59 crc kubenswrapper[4907]: I1129 14:44:59.982221 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp" Nov 29 14:44:59 crc kubenswrapper[4907]: I1129 14:44:59.985058 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-ngzmx" Nov 29 14:44:59 crc kubenswrapper[4907]: I1129 14:44:59.987915 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c"] Nov 29 14:44:59 crc kubenswrapper[4907]: I1129 14:44:59.989068 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:44:59 crc kubenswrapper[4907]: I1129 14:44:59.995055 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 29 14:44:59 crc kubenswrapper[4907]: I1129 14:44:59.997230 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c"] Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.013740 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-w6v2c"] Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.014960 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.028737 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp"] Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.157536 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mjbq\" (UniqueName: \"kubernetes.io/projected/3cc3523e-560b-4af9-a232-0c37f3343fac-kube-api-access-5mjbq\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.157792 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb7gc\" (UniqueName: \"kubernetes.io/projected/2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf-kube-api-access-nb7gc\") pod \"nmstate-metrics-7f946cbc9-wrbbp\" (UID: \"2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.157888 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" 
(UniqueName: \"kubernetes.io/secret/b0562e46-01ba-4930-a99f-92771a1804a9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s4p8c\" (UID: \"b0562e46-01ba-4930-a99f-92771a1804a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.157972 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65spr\" (UniqueName: \"kubernetes.io/projected/b0562e46-01ba-4930-a99f-92771a1804a9-kube-api-access-65spr\") pod \"nmstate-webhook-5f6d4c5ccb-s4p8c\" (UID: \"b0562e46-01ba-4930-a99f-92771a1804a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.158133 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3cc3523e-560b-4af9-a232-0c37f3343fac-ovs-socket\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.158257 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3cc3523e-560b-4af9-a232-0c37f3343fac-dbus-socket\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.158872 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3cc3523e-560b-4af9-a232-0c37f3343fac-nmstate-lock\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.162064 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89"] Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.162892 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.170294 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-2ns2g" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.170390 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.177560 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.188170 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf"] Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.190258 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.200718 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.207160 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf"] Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.212155 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.261564 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/25f5423c-ea17-4f48-9552-7012ca67b559-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.261636 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpml4\" (UniqueName: \"kubernetes.io/projected/25f5423c-ea17-4f48-9552-7012ca67b559-kube-api-access-lpml4\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.261690 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mjbq\" (UniqueName: \"kubernetes.io/projected/3cc3523e-560b-4af9-a232-0c37f3343fac-kube-api-access-5mjbq\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc 
kubenswrapper[4907]: I1129 14:45:00.261711 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb7gc\" (UniqueName: \"kubernetes.io/projected/2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf-kube-api-access-nb7gc\") pod \"nmstate-metrics-7f946cbc9-wrbbp\" (UID: \"2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.261729 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b0562e46-01ba-4930-a99f-92771a1804a9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s4p8c\" (UID: \"b0562e46-01ba-4930-a99f-92771a1804a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.261764 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65spr\" (UniqueName: \"kubernetes.io/projected/b0562e46-01ba-4930-a99f-92771a1804a9-kube-api-access-65spr\") pod \"nmstate-webhook-5f6d4c5ccb-s4p8c\" (UID: \"b0562e46-01ba-4930-a99f-92771a1804a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.261787 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/25f5423c-ea17-4f48-9552-7012ca67b559-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.261842 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3cc3523e-560b-4af9-a232-0c37f3343fac-ovs-socket\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " 
pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.261881 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3cc3523e-560b-4af9-a232-0c37f3343fac-dbus-socket\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.261922 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3cc3523e-560b-4af9-a232-0c37f3343fac-nmstate-lock\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.262031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3cc3523e-560b-4af9-a232-0c37f3343fac-nmstate-lock\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.262399 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3cc3523e-560b-4af9-a232-0c37f3343fac-ovs-socket\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.262633 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3cc3523e-560b-4af9-a232-0c37f3343fac-dbus-socket\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.273973 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b0562e46-01ba-4930-a99f-92771a1804a9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s4p8c\" (UID: \"b0562e46-01ba-4930-a99f-92771a1804a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.292338 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89"] Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.300900 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mjbq\" (UniqueName: \"kubernetes.io/projected/3cc3523e-560b-4af9-a232-0c37f3343fac-kube-api-access-5mjbq\") pod \"nmstate-handler-w6v2c\" (UID: \"3cc3523e-560b-4af9-a232-0c37f3343fac\") " pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.318748 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65spr\" (UniqueName: \"kubernetes.io/projected/b0562e46-01ba-4930-a99f-92771a1804a9-kube-api-access-65spr\") pod \"nmstate-webhook-5f6d4c5ccb-s4p8c\" (UID: \"b0562e46-01ba-4930-a99f-92771a1804a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.319376 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.338107 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.338278 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb7gc\" (UniqueName: \"kubernetes.io/projected/2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf-kube-api-access-nb7gc\") pod \"nmstate-metrics-7f946cbc9-wrbbp\" (UID: \"2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.363567 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-config-volume\") pod \"collect-profiles-29407125-8fwrf\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.363642 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-secret-volume\") pod \"collect-profiles-29407125-8fwrf\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.363673 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/25f5423c-ea17-4f48-9552-7012ca67b559-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.363696 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpml4\" (UniqueName: 
\"kubernetes.io/projected/25f5423c-ea17-4f48-9552-7012ca67b559-kube-api-access-lpml4\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.363726 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/25f5423c-ea17-4f48-9552-7012ca67b559-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.363745 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqrzh\" (UniqueName: \"kubernetes.io/projected/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-kube-api-access-gqrzh\") pod \"collect-profiles-29407125-8fwrf\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: E1129 14:45:00.364197 4907 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Nov 29 14:45:00 crc kubenswrapper[4907]: E1129 14:45:00.364305 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25f5423c-ea17-4f48-9552-7012ca67b559-plugin-serving-cert podName:25f5423c-ea17-4f48-9552-7012ca67b559 nodeName:}" failed. No retries permitted until 2025-11-29 14:45:00.864288848 +0000 UTC m=+998.851126500 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/25f5423c-ea17-4f48-9552-7012ca67b559-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-r4k89" (UID: "25f5423c-ea17-4f48-9552-7012ca67b559") : secret "plugin-serving-cert" not found Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.364584 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/25f5423c-ea17-4f48-9552-7012ca67b559-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.397327 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpml4\" (UniqueName: \"kubernetes.io/projected/25f5423c-ea17-4f48-9552-7012ca67b559-kube-api-access-lpml4\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.465888 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-config-volume\") pod \"collect-profiles-29407125-8fwrf\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.465971 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-secret-volume\") pod \"collect-profiles-29407125-8fwrf\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: 
I1129 14:45:00.466026 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqrzh\" (UniqueName: \"kubernetes.io/projected/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-kube-api-access-gqrzh\") pod \"collect-profiles-29407125-8fwrf\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.467253 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-config-volume\") pod \"collect-profiles-29407125-8fwrf\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.469590 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-secret-volume\") pod \"collect-profiles-29407125-8fwrf\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.484143 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqrzh\" (UniqueName: \"kubernetes.io/projected/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-kube-api-access-gqrzh\") pod \"collect-profiles-29407125-8fwrf\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.497313 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b5dbd778c-bhss9"] Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.498431 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b5dbd778c-bhss9"] Nov 29 14:45:00 crc 
kubenswrapper[4907]: I1129 14:45:00.498612 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.507392 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.599913 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.671350 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-oauth-serving-cert\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.671717 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-config\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.671760 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-trusted-ca-bundle\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.671788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-oauth-config\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.671856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-serving-cert\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.671901 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-service-ca\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.671917 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndzkf\" (UniqueName: \"kubernetes.io/projected/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-kube-api-access-ndzkf\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.773090 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-config\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.773980 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-config\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.774034 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-trusted-ca-bundle\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.774061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-oauth-config\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.774109 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-serving-cert\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.774135 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-service-ca\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.774152 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ndzkf\" (UniqueName: \"kubernetes.io/projected/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-kube-api-access-ndzkf\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.774213 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-oauth-serving-cert\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.774752 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-oauth-serving-cert\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.775969 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-service-ca\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.776973 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-trusted-ca-bundle\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.780796 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-serving-cert\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.781071 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-oauth-config\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.790871 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndzkf\" (UniqueName: \"kubernetes.io/projected/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-kube-api-access-ndzkf\") pod \"console-6b5dbd778c-bhss9\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") " pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.824539 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.876539 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/25f5423c-ea17-4f48-9552-7012ca67b559-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.880178 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/25f5423c-ea17-4f48-9552-7012ca67b559-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-r4k89\" (UID: \"25f5423c-ea17-4f48-9552-7012ca67b559\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.912944 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-w6v2c" event={"ID":"3cc3523e-560b-4af9-a232-0c37f3343fac","Type":"ContainerStarted","Data":"029000e8a8b2c040ec70d1140cafcbb801655d7f885a442b558c8a4233fe6984"} Nov 29 14:45:00 crc kubenswrapper[4907]: I1129 14:45:00.915497 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c"] Nov 29 14:45:00 crc kubenswrapper[4907]: W1129 14:45:00.923450 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0562e46_01ba_4930_a99f_92771a1804a9.slice/crio-f097c38e846522126ed94da982ad65ad2830198ae39b73daf59f660dbf604b07 WatchSource:0}: Error finding container f097c38e846522126ed94da982ad65ad2830198ae39b73daf59f660dbf604b07: Status 404 returned error can't find the container with id f097c38e846522126ed94da982ad65ad2830198ae39b73daf59f660dbf604b07 Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 
14:45:01.027308 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf"] Nov 29 14:45:01 crc kubenswrapper[4907]: W1129 14:45:01.030333 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3e28a2_d307_482f_b76f_a3b0d36bcc1a.slice/crio-1bdbbfbd7bddc69c860901a51988996e313e489d580160c0942b030d8063566f WatchSource:0}: Error finding container 1bdbbfbd7bddc69c860901a51988996e313e489d580160c0942b030d8063566f: Status 404 returned error can't find the container with id 1bdbbfbd7bddc69c860901a51988996e313e489d580160c0942b030d8063566f Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.077328 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.109320 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp"] Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.246751 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b5dbd778c-bhss9"] Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.544024 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89"] Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.934041 4907 generic.go:334] "Generic (PLEG): container finished" podID="da3e28a2-d307-482f-b76f-a3b0d36bcc1a" containerID="bc3a1b63452d37ace577ca8b60f10ee63713a31408b237fab5e498dce9eaa921" exitCode=0 Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.934159 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" 
event={"ID":"da3e28a2-d307-482f-b76f-a3b0d36bcc1a","Type":"ContainerDied","Data":"bc3a1b63452d37ace577ca8b60f10ee63713a31408b237fab5e498dce9eaa921"} Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.934434 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" event={"ID":"da3e28a2-d307-482f-b76f-a3b0d36bcc1a","Type":"ContainerStarted","Data":"1bdbbfbd7bddc69c860901a51988996e313e489d580160c0942b030d8063566f"} Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.936289 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" event={"ID":"b0562e46-01ba-4930-a99f-92771a1804a9","Type":"ContainerStarted","Data":"f097c38e846522126ed94da982ad65ad2830198ae39b73daf59f660dbf604b07"} Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.938718 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5dbd778c-bhss9" event={"ID":"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1","Type":"ContainerStarted","Data":"6e2efe74a40325503da6bcaeba21c72fdef5b0d06629709671596369b6dd6106"} Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.938793 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5dbd778c-bhss9" event={"ID":"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1","Type":"ContainerStarted","Data":"73a5782c2d5e14a6436d7cb098887d7e400d4d19553f9915d6e7d163283245c8"} Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.940909 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" event={"ID":"25f5423c-ea17-4f48-9552-7012ca67b559","Type":"ContainerStarted","Data":"17d25ee4e8ce0cc2bc58a8b517d381f3e6bc3862b53d1e33eb9d02b290964b56"} Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.942228 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp" 
event={"ID":"2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf","Type":"ContainerStarted","Data":"a03d96168850549f48e7cb00618d1b7924541d315f702108727fd8d89c7815b7"} Nov 29 14:45:01 crc kubenswrapper[4907]: I1129 14:45:01.987632 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b5dbd778c-bhss9" podStartSLOduration=1.9875887190000001 podStartE2EDuration="1.987588719s" podCreationTimestamp="2025-11-29 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:45:01.985217072 +0000 UTC m=+999.972054764" watchObservedRunningTime="2025-11-29 14:45:01.987588719 +0000 UTC m=+999.974426381" Nov 29 14:45:03 crc kubenswrapper[4907]: I1129 14:45:03.925329 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:03 crc kubenswrapper[4907]: I1129 14:45:03.966864 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" event={"ID":"da3e28a2-d307-482f-b76f-a3b0d36bcc1a","Type":"ContainerDied","Data":"1bdbbfbd7bddc69c860901a51988996e313e489d580160c0942b030d8063566f"} Nov 29 14:45:03 crc kubenswrapper[4907]: I1129 14:45:03.966907 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bdbbfbd7bddc69c860901a51988996e313e489d580160c0942b030d8063566f" Nov 29 14:45:03 crc kubenswrapper[4907]: I1129 14:45:03.966961 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf" Nov 29 14:45:03 crc kubenswrapper[4907]: I1129 14:45:03.987559 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-secret-volume\") pod \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " Nov 29 14:45:03 crc kubenswrapper[4907]: I1129 14:45:03.987681 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqrzh\" (UniqueName: \"kubernetes.io/projected/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-kube-api-access-gqrzh\") pod \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " Nov 29 14:45:03 crc kubenswrapper[4907]: I1129 14:45:03.987770 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-config-volume\") pod \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\" (UID: \"da3e28a2-d307-482f-b76f-a3b0d36bcc1a\") " Nov 29 14:45:03 crc kubenswrapper[4907]: I1129 14:45:03.989133 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-config-volume" (OuterVolumeSpecName: "config-volume") pod "da3e28a2-d307-482f-b76f-a3b0d36bcc1a" (UID: "da3e28a2-d307-482f-b76f-a3b0d36bcc1a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:45:03 crc kubenswrapper[4907]: I1129 14:45:03.995733 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da3e28a2-d307-482f-b76f-a3b0d36bcc1a" (UID: "da3e28a2-d307-482f-b76f-a3b0d36bcc1a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:45:04 crc kubenswrapper[4907]: I1129 14:45:04.004794 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-kube-api-access-gqrzh" (OuterVolumeSpecName: "kube-api-access-gqrzh") pod "da3e28a2-d307-482f-b76f-a3b0d36bcc1a" (UID: "da3e28a2-d307-482f-b76f-a3b0d36bcc1a"). InnerVolumeSpecName "kube-api-access-gqrzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:45:04 crc kubenswrapper[4907]: I1129 14:45:04.090697 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:04 crc kubenswrapper[4907]: I1129 14:45:04.090739 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:04 crc kubenswrapper[4907]: I1129 14:45:04.090785 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqrzh\" (UniqueName: \"kubernetes.io/projected/da3e28a2-d307-482f-b76f-a3b0d36bcc1a-kube-api-access-gqrzh\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:04 crc kubenswrapper[4907]: I1129 14:45:04.977756 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp" event={"ID":"2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf","Type":"ContainerStarted","Data":"5151654d1573641edf52d3fd3a2c332257484862f5cab59c34be653e86f11c97"} Nov 29 14:45:04 crc kubenswrapper[4907]: I1129 14:45:04.979735 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" event={"ID":"b0562e46-01ba-4930-a99f-92771a1804a9","Type":"ContainerStarted","Data":"e77c7c106fc3ea53c195d1607a9ffb96e4264d75a70950d832fcc025a8921fac"} Nov 29 14:45:04 crc 
kubenswrapper[4907]: I1129 14:45:04.979881 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:45:04 crc kubenswrapper[4907]: I1129 14:45:04.981587 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" event={"ID":"25f5423c-ea17-4f48-9552-7012ca67b559","Type":"ContainerStarted","Data":"9cd7b09620a3b7d4ad9302d15e65bde3ed8bb1bbaf7a87913bd5613b81d68d71"} Nov 29 14:45:05 crc kubenswrapper[4907]: I1129 14:45:05.008561 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" podStartSLOduration=2.424762883 podStartE2EDuration="6.00852924s" podCreationTimestamp="2025-11-29 14:44:59 +0000 UTC" firstStartedPulling="2025-11-29 14:45:00.925484999 +0000 UTC m=+998.912322651" lastFinishedPulling="2025-11-29 14:45:04.509251356 +0000 UTC m=+1002.496089008" observedRunningTime="2025-11-29 14:45:04.999204857 +0000 UTC m=+1002.986042539" watchObservedRunningTime="2025-11-29 14:45:05.00852924 +0000 UTC m=+1002.995366932" Nov 29 14:45:05 crc kubenswrapper[4907]: I1129 14:45:05.026061 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-r4k89" podStartSLOduration=2.075144127 podStartE2EDuration="5.026034214s" podCreationTimestamp="2025-11-29 14:45:00 +0000 UTC" firstStartedPulling="2025-11-29 14:45:01.559324786 +0000 UTC m=+999.546162438" lastFinishedPulling="2025-11-29 14:45:04.510214873 +0000 UTC m=+1002.497052525" observedRunningTime="2025-11-29 14:45:05.019955592 +0000 UTC m=+1003.006793264" watchObservedRunningTime="2025-11-29 14:45:05.026034214 +0000 UTC m=+1003.012871876" Nov 29 14:45:05 crc kubenswrapper[4907]: I1129 14:45:05.992788 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-w6v2c" 
event={"ID":"3cc3523e-560b-4af9-a232-0c37f3343fac","Type":"ContainerStarted","Data":"cc23cbebf2edd989ca523eded6cccd208603f66f6f550a8ab007caae27c458d5"} Nov 29 14:45:06 crc kubenswrapper[4907]: I1129 14:45:06.021383 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-w6v2c" podStartSLOduration=2.927495734 podStartE2EDuration="7.021360302s" podCreationTimestamp="2025-11-29 14:44:59 +0000 UTC" firstStartedPulling="2025-11-29 14:45:00.446615819 +0000 UTC m=+998.433453471" lastFinishedPulling="2025-11-29 14:45:04.540480347 +0000 UTC m=+1002.527318039" observedRunningTime="2025-11-29 14:45:06.014681834 +0000 UTC m=+1004.001519486" watchObservedRunningTime="2025-11-29 14:45:06.021360302 +0000 UTC m=+1004.008197954" Nov 29 14:45:07 crc kubenswrapper[4907]: I1129 14:45:07.005912 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:09 crc kubenswrapper[4907]: I1129 14:45:09.029747 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp" event={"ID":"2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf","Type":"ContainerStarted","Data":"1101f9f47e530149a476da42a5975bdfbd9ae2ef0327a156ceec044d40c916fb"} Nov 29 14:45:09 crc kubenswrapper[4907]: I1129 14:45:09.057689 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-wrbbp" podStartSLOduration=3.310504582 podStartE2EDuration="10.057666786s" podCreationTimestamp="2025-11-29 14:44:59 +0000 UTC" firstStartedPulling="2025-11-29 14:45:01.137018192 +0000 UTC m=+999.123855844" lastFinishedPulling="2025-11-29 14:45:07.884180396 +0000 UTC m=+1005.871018048" observedRunningTime="2025-11-29 14:45:09.053094137 +0000 UTC m=+1007.039931819" watchObservedRunningTime="2025-11-29 14:45:09.057666786 +0000 UTC m=+1007.044504478" Nov 29 14:45:10 crc kubenswrapper[4907]: I1129 14:45:10.380192 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-w6v2c" Nov 29 14:45:10 crc kubenswrapper[4907]: I1129 14:45:10.825234 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:10 crc kubenswrapper[4907]: I1129 14:45:10.825799 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:10 crc kubenswrapper[4907]: I1129 14:45:10.834049 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:11 crc kubenswrapper[4907]: I1129 14:45:11.049524 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b5dbd778c-bhss9" Nov 29 14:45:11 crc kubenswrapper[4907]: I1129 14:45:11.123837 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5db8778c9-qk2w4"] Nov 29 14:45:20 crc kubenswrapper[4907]: I1129 14:45:20.330185 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s4p8c" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.179150 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5db8778c9-qk2w4" podUID="a94fe315-0819-408a-948e-ea4ce03dde60" containerName="console" containerID="cri-o://c0b07e55d49d34b8a71b9a3d9185fcd1e2a66060042255152c4ee8c991c944d2" gracePeriod=15 Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.350716 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5db8778c9-qk2w4_a94fe315-0819-408a-948e-ea4ce03dde60/console/0.log" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.351408 4907 generic.go:334] "Generic (PLEG): container finished" podID="a94fe315-0819-408a-948e-ea4ce03dde60" 
containerID="c0b07e55d49d34b8a71b9a3d9185fcd1e2a66060042255152c4ee8c991c944d2" exitCode=2 Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.351501 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5db8778c9-qk2w4" event={"ID":"a94fe315-0819-408a-948e-ea4ce03dde60","Type":"ContainerDied","Data":"c0b07e55d49d34b8a71b9a3d9185fcd1e2a66060042255152c4ee8c991c944d2"} Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.648358 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5db8778c9-qk2w4_a94fe315-0819-408a-948e-ea4ce03dde60/console/0.log" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.648780 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.722958 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-oauth-serving-cert\") pod \"a94fe315-0819-408a-948e-ea4ce03dde60\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.723029 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-trusted-ca-bundle\") pod \"a94fe315-0819-408a-948e-ea4ce03dde60\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.723076 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-service-ca\") pod \"a94fe315-0819-408a-948e-ea4ce03dde60\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.723100 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4knkk\" (UniqueName: \"kubernetes.io/projected/a94fe315-0819-408a-948e-ea4ce03dde60-kube-api-access-4knkk\") pod \"a94fe315-0819-408a-948e-ea4ce03dde60\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.723203 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-serving-cert\") pod \"a94fe315-0819-408a-948e-ea4ce03dde60\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.723243 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-oauth-config\") pod \"a94fe315-0819-408a-948e-ea4ce03dde60\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.723306 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-console-config\") pod \"a94fe315-0819-408a-948e-ea4ce03dde60\" (UID: \"a94fe315-0819-408a-948e-ea4ce03dde60\") " Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.724156 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a94fe315-0819-408a-948e-ea4ce03dde60" (UID: "a94fe315-0819-408a-948e-ea4ce03dde60"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.724242 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-console-config" (OuterVolumeSpecName: "console-config") pod "a94fe315-0819-408a-948e-ea4ce03dde60" (UID: "a94fe315-0819-408a-948e-ea4ce03dde60"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.724340 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-service-ca" (OuterVolumeSpecName: "service-ca") pod "a94fe315-0819-408a-948e-ea4ce03dde60" (UID: "a94fe315-0819-408a-948e-ea4ce03dde60"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.724467 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a94fe315-0819-408a-948e-ea4ce03dde60" (UID: "a94fe315-0819-408a-948e-ea4ce03dde60"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.729946 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a94fe315-0819-408a-948e-ea4ce03dde60" (UID: "a94fe315-0819-408a-948e-ea4ce03dde60"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.732721 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94fe315-0819-408a-948e-ea4ce03dde60-kube-api-access-4knkk" (OuterVolumeSpecName: "kube-api-access-4knkk") pod "a94fe315-0819-408a-948e-ea4ce03dde60" (UID: "a94fe315-0819-408a-948e-ea4ce03dde60"). InnerVolumeSpecName "kube-api-access-4knkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.734685 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a94fe315-0819-408a-948e-ea4ce03dde60" (UID: "a94fe315-0819-408a-948e-ea4ce03dde60"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.825646 4907 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-console-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.825686 4907 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.825700 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.825713 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a94fe315-0819-408a-948e-ea4ce03dde60-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.825726 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4knkk\" (UniqueName: \"kubernetes.io/projected/a94fe315-0819-408a-948e-ea4ce03dde60-kube-api-access-4knkk\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.825740 4907 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:36 crc kubenswrapper[4907]: I1129 14:45:36.825751 4907 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a94fe315-0819-408a-948e-ea4ce03dde60-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:37 crc kubenswrapper[4907]: I1129 14:45:37.362178 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5db8778c9-qk2w4_a94fe315-0819-408a-948e-ea4ce03dde60/console/0.log" Nov 29 14:45:37 crc kubenswrapper[4907]: I1129 14:45:37.362681 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5db8778c9-qk2w4" event={"ID":"a94fe315-0819-408a-948e-ea4ce03dde60","Type":"ContainerDied","Data":"4ed6ba3028f60707113b0892a6ef2730d6501c43e5a7385fd25428c718e09236"} Nov 29 14:45:37 crc kubenswrapper[4907]: I1129 14:45:37.362762 4907 scope.go:117] "RemoveContainer" containerID="c0b07e55d49d34b8a71b9a3d9185fcd1e2a66060042255152c4ee8c991c944d2" Nov 29 14:45:37 crc kubenswrapper[4907]: I1129 14:45:37.362971 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5db8778c9-qk2w4" Nov 29 14:45:37 crc kubenswrapper[4907]: I1129 14:45:37.417147 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5db8778c9-qk2w4"] Nov 29 14:45:37 crc kubenswrapper[4907]: I1129 14:45:37.429848 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5db8778c9-qk2w4"] Nov 29 14:45:38 crc kubenswrapper[4907]: I1129 14:45:38.490379 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94fe315-0819-408a-948e-ea4ce03dde60" path="/var/lib/kubelet/pods/a94fe315-0819-408a-948e-ea4ce03dde60/volumes" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.559745 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk"] Nov 29 14:45:39 crc kubenswrapper[4907]: E1129 14:45:39.560617 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94fe315-0819-408a-948e-ea4ce03dde60" containerName="console" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.560634 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94fe315-0819-408a-948e-ea4ce03dde60" containerName="console" Nov 29 14:45:39 crc kubenswrapper[4907]: E1129 14:45:39.560644 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3e28a2-d307-482f-b76f-a3b0d36bcc1a" containerName="collect-profiles" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.560650 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3e28a2-d307-482f-b76f-a3b0d36bcc1a" containerName="collect-profiles" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.560812 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="da3e28a2-d307-482f-b76f-a3b0d36bcc1a" containerName="collect-profiles" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.560828 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a94fe315-0819-408a-948e-ea4ce03dde60" containerName="console" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.562263 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.566514 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.571623 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk"] Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.617135 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.617275 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.617355 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wzcf\" (UniqueName: \"kubernetes.io/projected/1bea1f3c-2bd3-4013-a502-9b9ed934f733-kube-api-access-7wzcf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk\" 
(UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.720106 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.720229 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wzcf\" (UniqueName: \"kubernetes.io/projected/1bea1f3c-2bd3-4013-a502-9b9ed934f733-kube-api-access-7wzcf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.720328 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.721194 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc 
kubenswrapper[4907]: I1129 14:45:39.721515 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.755005 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wzcf\" (UniqueName: \"kubernetes.io/projected/1bea1f3c-2bd3-4013-a502-9b9ed934f733-kube-api-access-7wzcf\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:39 crc kubenswrapper[4907]: I1129 14:45:39.934582 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:40 crc kubenswrapper[4907]: I1129 14:45:40.236398 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk"] Nov 29 14:45:40 crc kubenswrapper[4907]: I1129 14:45:40.390085 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" event={"ID":"1bea1f3c-2bd3-4013-a502-9b9ed934f733","Type":"ContainerStarted","Data":"158d40153d2516a255b1080d8b58bde600d21a3c1f1edcd638c6251e9ff7d8b9"} Nov 29 14:45:41 crc kubenswrapper[4907]: I1129 14:45:41.401938 4907 generic.go:334] "Generic (PLEG): container finished" podID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerID="9bf977c3185adda290b29317ad4bc4049adb5c0010ffd2dcbd288181334e0670" exitCode=0 Nov 29 14:45:41 crc kubenswrapper[4907]: I1129 14:45:41.402263 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" event={"ID":"1bea1f3c-2bd3-4013-a502-9b9ed934f733","Type":"ContainerDied","Data":"9bf977c3185adda290b29317ad4bc4049adb5c0010ffd2dcbd288181334e0670"} Nov 29 14:45:41 crc kubenswrapper[4907]: I1129 14:45:41.406207 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 14:45:43 crc kubenswrapper[4907]: I1129 14:45:43.429853 4907 generic.go:334] "Generic (PLEG): container finished" podID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerID="43e16b315c46293cb1d6c0c17ff36dd18a042de811523c39746851fa46c2c47f" exitCode=0 Nov 29 14:45:43 crc kubenswrapper[4907]: I1129 14:45:43.429983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" 
event={"ID":"1bea1f3c-2bd3-4013-a502-9b9ed934f733","Type":"ContainerDied","Data":"43e16b315c46293cb1d6c0c17ff36dd18a042de811523c39746851fa46c2c47f"} Nov 29 14:45:44 crc kubenswrapper[4907]: I1129 14:45:44.442706 4907 generic.go:334] "Generic (PLEG): container finished" podID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerID="ae1d2aa383db4672320b05ac208c4d9fae3245d98ac19805094b5ae9179a5f39" exitCode=0 Nov 29 14:45:44 crc kubenswrapper[4907]: I1129 14:45:44.442823 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" event={"ID":"1bea1f3c-2bd3-4013-a502-9b9ed934f733","Type":"ContainerDied","Data":"ae1d2aa383db4672320b05ac208c4d9fae3245d98ac19805094b5ae9179a5f39"} Nov 29 14:45:45 crc kubenswrapper[4907]: I1129 14:45:45.780952 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:45 crc kubenswrapper[4907]: I1129 14:45:45.962334 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wzcf\" (UniqueName: \"kubernetes.io/projected/1bea1f3c-2bd3-4013-a502-9b9ed934f733-kube-api-access-7wzcf\") pod \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " Nov 29 14:45:45 crc kubenswrapper[4907]: I1129 14:45:45.962519 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-bundle\") pod \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\" (UID: \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " Nov 29 14:45:45 crc kubenswrapper[4907]: I1129 14:45:45.962560 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-util\") pod \"1bea1f3c-2bd3-4013-a502-9b9ed934f733\" (UID: 
\"1bea1f3c-2bd3-4013-a502-9b9ed934f733\") " Nov 29 14:45:45 crc kubenswrapper[4907]: I1129 14:45:45.964420 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-bundle" (OuterVolumeSpecName: "bundle") pod "1bea1f3c-2bd3-4013-a502-9b9ed934f733" (UID: "1bea1f3c-2bd3-4013-a502-9b9ed934f733"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:45:45 crc kubenswrapper[4907]: I1129 14:45:45.974789 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bea1f3c-2bd3-4013-a502-9b9ed934f733-kube-api-access-7wzcf" (OuterVolumeSpecName: "kube-api-access-7wzcf") pod "1bea1f3c-2bd3-4013-a502-9b9ed934f733" (UID: "1bea1f3c-2bd3-4013-a502-9b9ed934f733"). InnerVolumeSpecName "kube-api-access-7wzcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:45:46 crc kubenswrapper[4907]: I1129 14:45:46.064694 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wzcf\" (UniqueName: \"kubernetes.io/projected/1bea1f3c-2bd3-4013-a502-9b9ed934f733-kube-api-access-7wzcf\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:46 crc kubenswrapper[4907]: I1129 14:45:46.064743 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:46 crc kubenswrapper[4907]: I1129 14:45:46.107014 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-util" (OuterVolumeSpecName: "util") pod "1bea1f3c-2bd3-4013-a502-9b9ed934f733" (UID: "1bea1f3c-2bd3-4013-a502-9b9ed934f733"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:45:46 crc kubenswrapper[4907]: I1129 14:45:46.166560 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bea1f3c-2bd3-4013-a502-9b9ed934f733-util\") on node \"crc\" DevicePath \"\"" Nov 29 14:45:46 crc kubenswrapper[4907]: I1129 14:45:46.469568 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" event={"ID":"1bea1f3c-2bd3-4013-a502-9b9ed934f733","Type":"ContainerDied","Data":"158d40153d2516a255b1080d8b58bde600d21a3c1f1edcd638c6251e9ff7d8b9"} Nov 29 14:45:46 crc kubenswrapper[4907]: I1129 14:45:46.469616 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="158d40153d2516a255b1080d8b58bde600d21a3c1f1edcd638c6251e9ff7d8b9" Nov 29 14:45:46 crc kubenswrapper[4907]: I1129 14:45:46.469671 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.071047 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv"] Nov 29 14:45:55 crc kubenswrapper[4907]: E1129 14:45:55.072319 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerName="pull" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.072336 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerName="pull" Nov 29 14:45:55 crc kubenswrapper[4907]: E1129 14:45:55.072348 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerName="util" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.072356 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerName="util" Nov 29 14:45:55 crc kubenswrapper[4907]: E1129 14:45:55.072398 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerName="extract" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.072405 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerName="extract" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.072589 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bea1f3c-2bd3-4013-a502-9b9ed934f733" containerName="extract" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.073268 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.077572 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.077882 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.077941 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.078261 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vxh7c" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.078908 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.097288 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv"] Nov 29 14:45:55 crc 
kubenswrapper[4907]: I1129 14:45:55.167857 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89e52ee4-247d-402e-9c42-8f39e8529314-apiservice-cert\") pod \"metallb-operator-controller-manager-6dfdbf684f-xmtsv\" (UID: \"89e52ee4-247d-402e-9c42-8f39e8529314\") " pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.167934 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6qzm\" (UniqueName: \"kubernetes.io/projected/89e52ee4-247d-402e-9c42-8f39e8529314-kube-api-access-h6qzm\") pod \"metallb-operator-controller-manager-6dfdbf684f-xmtsv\" (UID: \"89e52ee4-247d-402e-9c42-8f39e8529314\") " pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.167976 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89e52ee4-247d-402e-9c42-8f39e8529314-webhook-cert\") pod \"metallb-operator-controller-manager-6dfdbf684f-xmtsv\" (UID: \"89e52ee4-247d-402e-9c42-8f39e8529314\") " pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.269742 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89e52ee4-247d-402e-9c42-8f39e8529314-apiservice-cert\") pod \"metallb-operator-controller-manager-6dfdbf684f-xmtsv\" (UID: \"89e52ee4-247d-402e-9c42-8f39e8529314\") " pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.269800 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6qzm\" (UniqueName: 
\"kubernetes.io/projected/89e52ee4-247d-402e-9c42-8f39e8529314-kube-api-access-h6qzm\") pod \"metallb-operator-controller-manager-6dfdbf684f-xmtsv\" (UID: \"89e52ee4-247d-402e-9c42-8f39e8529314\") " pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.269826 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89e52ee4-247d-402e-9c42-8f39e8529314-webhook-cert\") pod \"metallb-operator-controller-manager-6dfdbf684f-xmtsv\" (UID: \"89e52ee4-247d-402e-9c42-8f39e8529314\") " pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.277061 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89e52ee4-247d-402e-9c42-8f39e8529314-apiservice-cert\") pod \"metallb-operator-controller-manager-6dfdbf684f-xmtsv\" (UID: \"89e52ee4-247d-402e-9c42-8f39e8529314\") " pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.283564 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89e52ee4-247d-402e-9c42-8f39e8529314-webhook-cert\") pod \"metallb-operator-controller-manager-6dfdbf684f-xmtsv\" (UID: \"89e52ee4-247d-402e-9c42-8f39e8529314\") " pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.313048 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6qzm\" (UniqueName: \"kubernetes.io/projected/89e52ee4-247d-402e-9c42-8f39e8529314-kube-api-access-h6qzm\") pod \"metallb-operator-controller-manager-6dfdbf684f-xmtsv\" (UID: \"89e52ee4-247d-402e-9c42-8f39e8529314\") " 
pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.397760 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.528417 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd"] Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.530349 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.536663 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.538204 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-ccdng" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.538332 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.562368 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd"] Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.682630 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/008f37e0-a6cb-4202-aed6-fa2b3734e881-apiservice-cert\") pod \"metallb-operator-webhook-server-785f7fb488-nl5qd\" (UID: \"008f37e0-a6cb-4202-aed6-fa2b3734e881\") " pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.682935 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/008f37e0-a6cb-4202-aed6-fa2b3734e881-webhook-cert\") pod \"metallb-operator-webhook-server-785f7fb488-nl5qd\" (UID: \"008f37e0-a6cb-4202-aed6-fa2b3734e881\") " pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.683031 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cptpn\" (UniqueName: \"kubernetes.io/projected/008f37e0-a6cb-4202-aed6-fa2b3734e881-kube-api-access-cptpn\") pod \"metallb-operator-webhook-server-785f7fb488-nl5qd\" (UID: \"008f37e0-a6cb-4202-aed6-fa2b3734e881\") " pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.784466 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/008f37e0-a6cb-4202-aed6-fa2b3734e881-apiservice-cert\") pod \"metallb-operator-webhook-server-785f7fb488-nl5qd\" (UID: \"008f37e0-a6cb-4202-aed6-fa2b3734e881\") " pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.784534 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/008f37e0-a6cb-4202-aed6-fa2b3734e881-webhook-cert\") pod \"metallb-operator-webhook-server-785f7fb488-nl5qd\" (UID: \"008f37e0-a6cb-4202-aed6-fa2b3734e881\") " pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.784572 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cptpn\" (UniqueName: \"kubernetes.io/projected/008f37e0-a6cb-4202-aed6-fa2b3734e881-kube-api-access-cptpn\") pod 
\"metallb-operator-webhook-server-785f7fb488-nl5qd\" (UID: \"008f37e0-a6cb-4202-aed6-fa2b3734e881\") " pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.798257 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/008f37e0-a6cb-4202-aed6-fa2b3734e881-apiservice-cert\") pod \"metallb-operator-webhook-server-785f7fb488-nl5qd\" (UID: \"008f37e0-a6cb-4202-aed6-fa2b3734e881\") " pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.798883 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/008f37e0-a6cb-4202-aed6-fa2b3734e881-webhook-cert\") pod \"metallb-operator-webhook-server-785f7fb488-nl5qd\" (UID: \"008f37e0-a6cb-4202-aed6-fa2b3734e881\") " pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.807098 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cptpn\" (UniqueName: \"kubernetes.io/projected/008f37e0-a6cb-4202-aed6-fa2b3734e881-kube-api-access-cptpn\") pod \"metallb-operator-webhook-server-785f7fb488-nl5qd\" (UID: \"008f37e0-a6cb-4202-aed6-fa2b3734e881\") " pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.864201 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:45:55 crc kubenswrapper[4907]: I1129 14:45:55.948465 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv"] Nov 29 14:45:55 crc kubenswrapper[4907]: W1129 14:45:55.968016 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e52ee4_247d_402e_9c42_8f39e8529314.slice/crio-65a9c33389a004e6a3758d19e3ce57664f40a377a278638d0865d857ad8099e4 WatchSource:0}: Error finding container 65a9c33389a004e6a3758d19e3ce57664f40a377a278638d0865d857ad8099e4: Status 404 returned error can't find the container with id 65a9c33389a004e6a3758d19e3ce57664f40a377a278638d0865d857ad8099e4 Nov 29 14:45:56 crc kubenswrapper[4907]: I1129 14:45:56.343318 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd"] Nov 29 14:45:56 crc kubenswrapper[4907]: W1129 14:45:56.348699 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod008f37e0_a6cb_4202_aed6_fa2b3734e881.slice/crio-dcfb66b8be2b62447656dc3474488a87ef226d545af6c68d05882e080d378ca1 WatchSource:0}: Error finding container dcfb66b8be2b62447656dc3474488a87ef226d545af6c68d05882e080d378ca1: Status 404 returned error can't find the container with id dcfb66b8be2b62447656dc3474488a87ef226d545af6c68d05882e080d378ca1 Nov 29 14:45:56 crc kubenswrapper[4907]: I1129 14:45:56.575185 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" event={"ID":"89e52ee4-247d-402e-9c42-8f39e8529314","Type":"ContainerStarted","Data":"65a9c33389a004e6a3758d19e3ce57664f40a377a278638d0865d857ad8099e4"} Nov 29 14:45:56 crc kubenswrapper[4907]: I1129 14:45:56.577249 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" event={"ID":"008f37e0-a6cb-4202-aed6-fa2b3734e881","Type":"ContainerStarted","Data":"dcfb66b8be2b62447656dc3474488a87ef226d545af6c68d05882e080d378ca1"} Nov 29 14:46:00 crc kubenswrapper[4907]: I1129 14:46:00.642868 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" event={"ID":"89e52ee4-247d-402e-9c42-8f39e8529314","Type":"ContainerStarted","Data":"8ab8dacd261067e7ef0acd74e5fae38c2ad949b57770cbb83b4d9e1066ddef19"} Nov 29 14:46:00 crc kubenswrapper[4907]: I1129 14:46:00.643393 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:46:02 crc kubenswrapper[4907]: I1129 14:46:02.541208 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" podStartSLOduration=3.80309922 podStartE2EDuration="7.541182583s" podCreationTimestamp="2025-11-29 14:45:55 +0000 UTC" firstStartedPulling="2025-11-29 14:45:55.97616568 +0000 UTC m=+1053.963003332" lastFinishedPulling="2025-11-29 14:45:59.714249043 +0000 UTC m=+1057.701086695" observedRunningTime="2025-11-29 14:46:00.668900631 +0000 UTC m=+1058.655738283" watchObservedRunningTime="2025-11-29 14:46:02.541182583 +0000 UTC m=+1060.528020245" Nov 29 14:46:02 crc kubenswrapper[4907]: I1129 14:46:02.671732 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" event={"ID":"008f37e0-a6cb-4202-aed6-fa2b3734e881","Type":"ContainerStarted","Data":"057bf60b0093533282e9621e87ef3f611007553df06a133a48950d5a4eb0dfb5"} Nov 29 14:46:02 crc kubenswrapper[4907]: I1129 14:46:02.672908 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 
14:46:02 crc kubenswrapper[4907]: I1129 14:46:02.692529 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" podStartSLOduration=2.477936029 podStartE2EDuration="7.692507017s" podCreationTimestamp="2025-11-29 14:45:55 +0000 UTC" firstStartedPulling="2025-11-29 14:45:56.352131045 +0000 UTC m=+1054.338968707" lastFinishedPulling="2025-11-29 14:46:01.566702053 +0000 UTC m=+1059.553539695" observedRunningTime="2025-11-29 14:46:02.689946206 +0000 UTC m=+1060.676783868" watchObservedRunningTime="2025-11-29 14:46:02.692507017 +0000 UTC m=+1060.679344669" Nov 29 14:46:15 crc kubenswrapper[4907]: I1129 14:46:15.869784 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-785f7fb488-nl5qd" Nov 29 14:46:35 crc kubenswrapper[4907]: I1129 14:46:35.402363 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6dfdbf684f-xmtsv" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.335329 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-84ncb"] Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.339595 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.341947 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gqbxp" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.342086 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.343917 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.389738 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz"] Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.391005 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.393790 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.427587 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz"] Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.458407 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-reloader\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.458491 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-frr-sockets\") pod \"frr-k8s-84ncb\" (UID: 
\"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.458529 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fcq9\" (UniqueName: \"kubernetes.io/projected/749d56ce-a6c4-4b8f-bd45-0f8a44a9d192-kube-api-access-9fcq9\") pod \"frr-k8s-webhook-server-7fcb986d4-tdzhz\" (UID: \"749d56ce-a6c4-4b8f-bd45-0f8a44a9d192\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.458552 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/749d56ce-a6c4-4b8f-bd45-0f8a44a9d192-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-tdzhz\" (UID: \"749d56ce-a6c4-4b8f-bd45-0f8a44a9d192\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.458570 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-frr-conf\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.458589 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df794960-249c-4965-814c-36decf5db5d3-metrics-certs\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.458611 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/df794960-249c-4965-814c-36decf5db5d3-frr-startup\") pod \"frr-k8s-84ncb\" (UID: 
\"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.458631 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-metrics\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.458674 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgjs\" (UniqueName: \"kubernetes.io/projected/df794960-249c-4965-814c-36decf5db5d3-kube-api-access-rkgjs\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.494231 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-csdjw"] Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.495711 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.499631 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.499902 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.500154 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kxznv" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.500370 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.517110 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-bkm82"] Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.518830 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.520954 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.536040 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-bkm82"] Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560379 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-metrics-certs\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560429 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-reloader\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560477 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-metallb-excludel2\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560509 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncqh8\" (UniqueName: \"kubernetes.io/projected/a13cb44c-0bae-4a00-9f98-ad5c6f3c6660-kube-api-access-ncqh8\") pod \"controller-f8648f98b-bkm82\" (UID: \"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660\") " pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: 
I1129 14:46:36.560530 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-frr-sockets\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560558 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkkwn\" (UniqueName: \"kubernetes.io/projected/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-kube-api-access-mkkwn\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560578 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fcq9\" (UniqueName: \"kubernetes.io/projected/749d56ce-a6c4-4b8f-bd45-0f8a44a9d192-kube-api-access-9fcq9\") pod \"frr-k8s-webhook-server-7fcb986d4-tdzhz\" (UID: \"749d56ce-a6c4-4b8f-bd45-0f8a44a9d192\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560598 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/749d56ce-a6c4-4b8f-bd45-0f8a44a9d192-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-tdzhz\" (UID: \"749d56ce-a6c4-4b8f-bd45-0f8a44a9d192\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560664 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-frr-conf\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560712 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df794960-249c-4965-814c-36decf5db5d3-metrics-certs\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560752 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/df794960-249c-4965-814c-36decf5db5d3-frr-startup\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560791 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-metrics\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560872 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a13cb44c-0bae-4a00-9f98-ad5c6f3c6660-cert\") pod \"controller-f8648f98b-bkm82\" (UID: \"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660\") " pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560914 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-memberlist\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560954 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-reloader\") pod \"frr-k8s-84ncb\" (UID: 
\"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.560961 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgjs\" (UniqueName: \"kubernetes.io/projected/df794960-249c-4965-814c-36decf5db5d3-kube-api-access-rkgjs\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.561121 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a13cb44c-0bae-4a00-9f98-ad5c6f3c6660-metrics-certs\") pod \"controller-f8648f98b-bkm82\" (UID: \"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660\") " pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.561536 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-frr-sockets\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.561541 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-metrics\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.561710 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/df794960-249c-4965-814c-36decf5db5d3-frr-conf\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.562616 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/df794960-249c-4965-814c-36decf5db5d3-frr-startup\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.578127 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fcq9\" (UniqueName: \"kubernetes.io/projected/749d56ce-a6c4-4b8f-bd45-0f8a44a9d192-kube-api-access-9fcq9\") pod \"frr-k8s-webhook-server-7fcb986d4-tdzhz\" (UID: \"749d56ce-a6c4-4b8f-bd45-0f8a44a9d192\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.580019 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgjs\" (UniqueName: \"kubernetes.io/projected/df794960-249c-4965-814c-36decf5db5d3-kube-api-access-rkgjs\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.581144 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/749d56ce-a6c4-4b8f-bd45-0f8a44a9d192-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-tdzhz\" (UID: \"749d56ce-a6c4-4b8f-bd45-0f8a44a9d192\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.581361 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/df794960-249c-4965-814c-36decf5db5d3-metrics-certs\") pod \"frr-k8s-84ncb\" (UID: \"df794960-249c-4965-814c-36decf5db5d3\") " pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.661525 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.662541 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a13cb44c-0bae-4a00-9f98-ad5c6f3c6660-cert\") pod \"controller-f8648f98b-bkm82\" (UID: \"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660\") " pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.662604 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-memberlist\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.662653 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a13cb44c-0bae-4a00-9f98-ad5c6f3c6660-metrics-certs\") pod \"controller-f8648f98b-bkm82\" (UID: \"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660\") " pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.662688 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-metrics-certs\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.662710 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-metallb-excludel2\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.662738 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ncqh8\" (UniqueName: \"kubernetes.io/projected/a13cb44c-0bae-4a00-9f98-ad5c6f3c6660-kube-api-access-ncqh8\") pod \"controller-f8648f98b-bkm82\" (UID: \"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660\") " pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: E1129 14:46:36.662760 4907 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 29 14:46:36 crc kubenswrapper[4907]: E1129 14:46:36.662823 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-memberlist podName:a0198f8f-d4b9-4452-abda-d3e0df0ec26d nodeName:}" failed. No retries permitted until 2025-11-29 14:46:37.16280193 +0000 UTC m=+1095.149639592 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-memberlist") pod "speaker-csdjw" (UID: "a0198f8f-d4b9-4452-abda-d3e0df0ec26d") : secret "metallb-memberlist" not found Nov 29 14:46:36 crc kubenswrapper[4907]: E1129 14:46:36.662846 4907 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Nov 29 14:46:36 crc kubenswrapper[4907]: E1129 14:46:36.662936 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-metrics-certs podName:a0198f8f-d4b9-4452-abda-d3e0df0ec26d nodeName:}" failed. No retries permitted until 2025-11-29 14:46:37.162906213 +0000 UTC m=+1095.149743905 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-metrics-certs") pod "speaker-csdjw" (UID: "a0198f8f-d4b9-4452-abda-d3e0df0ec26d") : secret "speaker-certs-secret" not found Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.662768 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkkwn\" (UniqueName: \"kubernetes.io/projected/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-kube-api-access-mkkwn\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.663744 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-metallb-excludel2\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.664433 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.667668 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a13cb44c-0bae-4a00-9f98-ad5c6f3c6660-metrics-certs\") pod \"controller-f8648f98b-bkm82\" (UID: \"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660\") " pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.679833 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkkwn\" (UniqueName: \"kubernetes.io/projected/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-kube-api-access-mkkwn\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.688215 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a13cb44c-0bae-4a00-9f98-ad5c6f3c6660-cert\") pod \"controller-f8648f98b-bkm82\" (UID: \"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660\") " pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.698720 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncqh8\" (UniqueName: \"kubernetes.io/projected/a13cb44c-0bae-4a00-9f98-ad5c6f3c6660-kube-api-access-ncqh8\") pod \"controller-f8648f98b-bkm82\" (UID: \"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660\") " pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.737201 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.844882 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:36 crc kubenswrapper[4907]: I1129 14:46:36.978401 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerStarted","Data":"f702277c0e049a5af90ec2a96992f94abbdce251db1b9f12d24c6db36d49e694"} Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.171957 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-memberlist\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.172393 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-metrics-certs\") pod 
\"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:37 crc kubenswrapper[4907]: E1129 14:46:37.172218 4907 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 29 14:46:37 crc kubenswrapper[4907]: E1129 14:46:37.173005 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-memberlist podName:a0198f8f-d4b9-4452-abda-d3e0df0ec26d nodeName:}" failed. No retries permitted until 2025-11-29 14:46:38.172962091 +0000 UTC m=+1096.159799783 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-memberlist") pod "speaker-csdjw" (UID: "a0198f8f-d4b9-4452-abda-d3e0df0ec26d") : secret "metallb-memberlist" not found Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.190372 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz"] Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.191631 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-metrics-certs\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.320406 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-bkm82"] Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.988169 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-bkm82" event={"ID":"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660","Type":"ContainerStarted","Data":"4222654e3f19af98969a1ccd1938f4081814eac57cd4fc580130238773879e71"} Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.988525 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-bkm82" event={"ID":"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660","Type":"ContainerStarted","Data":"6dba179753a70c9183562a1346defef9cc34393ec3115d4c8020aaa1d57cf352"} Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.988539 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-bkm82" event={"ID":"a13cb44c-0bae-4a00-9f98-ad5c6f3c6660","Type":"ContainerStarted","Data":"364b0ad7ee731259a0406517379bee4c81de1a68a387cf56d2855d45a82725d4"} Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.988564 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:37 crc kubenswrapper[4907]: I1129 14:46:37.989572 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" event={"ID":"749d56ce-a6c4-4b8f-bd45-0f8a44a9d192","Type":"ContainerStarted","Data":"a35a182fbba1f031f14c8e805e3fe8e6727a9dce94aa0ba60697f36097fac4a8"} Nov 29 14:46:38 crc kubenswrapper[4907]: I1129 14:46:38.012914 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-bkm82" podStartSLOduration=2.012888955 podStartE2EDuration="2.012888955s" podCreationTimestamp="2025-11-29 14:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:46:38.006281661 +0000 UTC m=+1095.993119313" watchObservedRunningTime="2025-11-29 14:46:38.012888955 +0000 UTC m=+1095.999726607" Nov 29 14:46:38 crc kubenswrapper[4907]: I1129 14:46:38.192224 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-memberlist\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 
14:46:38 crc kubenswrapper[4907]: I1129 14:46:38.200229 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a0198f8f-d4b9-4452-abda-d3e0df0ec26d-memberlist\") pod \"speaker-csdjw\" (UID: \"a0198f8f-d4b9-4452-abda-d3e0df0ec26d\") " pod="metallb-system/speaker-csdjw" Nov 29 14:46:38 crc kubenswrapper[4907]: I1129 14:46:38.313818 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-csdjw" Nov 29 14:46:38 crc kubenswrapper[4907]: W1129 14:46:38.349189 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0198f8f_d4b9_4452_abda_d3e0df0ec26d.slice/crio-8388d3f311eddf352abfeb51c53bb4efac05fccef0232c1b723d7594702da7d6 WatchSource:0}: Error finding container 8388d3f311eddf352abfeb51c53bb4efac05fccef0232c1b723d7594702da7d6: Status 404 returned error can't find the container with id 8388d3f311eddf352abfeb51c53bb4efac05fccef0232c1b723d7594702da7d6 Nov 29 14:46:39 crc kubenswrapper[4907]: I1129 14:46:38.999875 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-csdjw" event={"ID":"a0198f8f-d4b9-4452-abda-d3e0df0ec26d","Type":"ContainerStarted","Data":"657d912b4af6dc38e1321718640898f62e738dfce4d3bb071684da0f75707a44"} Nov 29 14:46:39 crc kubenswrapper[4907]: I1129 14:46:39.000183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-csdjw" event={"ID":"a0198f8f-d4b9-4452-abda-d3e0df0ec26d","Type":"ContainerStarted","Data":"8388d3f311eddf352abfeb51c53bb4efac05fccef0232c1b723d7594702da7d6"} Nov 29 14:46:40 crc kubenswrapper[4907]: I1129 14:46:40.025891 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-csdjw" event={"ID":"a0198f8f-d4b9-4452-abda-d3e0df0ec26d","Type":"ContainerStarted","Data":"8ecf022132689106a9723445d35aa917d7cb2c2d5dc56f29ec8d12091e011f6e"} Nov 29 14:46:40 crc 
kubenswrapper[4907]: I1129 14:46:40.026312 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-csdjw" Nov 29 14:46:40 crc kubenswrapper[4907]: I1129 14:46:40.055097 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-csdjw" podStartSLOduration=4.055072941 podStartE2EDuration="4.055072941s" podCreationTimestamp="2025-11-29 14:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:46:40.046318336 +0000 UTC m=+1098.033155988" watchObservedRunningTime="2025-11-29 14:46:40.055072941 +0000 UTC m=+1098.041910593" Nov 29 14:46:46 crc kubenswrapper[4907]: I1129 14:46:46.081385 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" event={"ID":"749d56ce-a6c4-4b8f-bd45-0f8a44a9d192","Type":"ContainerStarted","Data":"1eb8ff35ee5e6c7cd3e6aa526b5d0c3988070bf436273d9a9ea44615881e738f"} Nov 29 14:46:46 crc kubenswrapper[4907]: I1129 14:46:46.083467 4907 generic.go:334] "Generic (PLEG): container finished" podID="df794960-249c-4965-814c-36decf5db5d3" containerID="fd2248b430bb0adb04d08592a775005f4d682a21e02e63a50b6477994fda4115" exitCode=0 Nov 29 14:46:46 crc kubenswrapper[4907]: I1129 14:46:46.083618 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerDied","Data":"fd2248b430bb0adb04d08592a775005f4d682a21e02e63a50b6477994fda4115"} Nov 29 14:46:46 crc kubenswrapper[4907]: I1129 14:46:46.083969 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:46 crc kubenswrapper[4907]: I1129 14:46:46.108879 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" 
podStartSLOduration=2.345234283 podStartE2EDuration="10.108859063s" podCreationTimestamp="2025-11-29 14:46:36 +0000 UTC" firstStartedPulling="2025-11-29 14:46:37.212188236 +0000 UTC m=+1095.199025888" lastFinishedPulling="2025-11-29 14:46:44.975813016 +0000 UTC m=+1102.962650668" observedRunningTime="2025-11-29 14:46:46.105813898 +0000 UTC m=+1104.092651570" watchObservedRunningTime="2025-11-29 14:46:46.108859063 +0000 UTC m=+1104.095696725" Nov 29 14:46:47 crc kubenswrapper[4907]: I1129 14:46:47.091711 4907 generic.go:334] "Generic (PLEG): container finished" podID="df794960-249c-4965-814c-36decf5db5d3" containerID="3b6bc5a4d596df1ba8b2be721df73102b9bf5bfe9e631e159de5b0df8a727aae" exitCode=0 Nov 29 14:46:47 crc kubenswrapper[4907]: I1129 14:46:47.091903 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerDied","Data":"3b6bc5a4d596df1ba8b2be721df73102b9bf5bfe9e631e159de5b0df8a727aae"} Nov 29 14:46:48 crc kubenswrapper[4907]: I1129 14:46:48.103607 4907 generic.go:334] "Generic (PLEG): container finished" podID="df794960-249c-4965-814c-36decf5db5d3" containerID="3916d056fd091b3ea0ee0497ab57649502ad6f44cdd0ac9825274dd0c41b8376" exitCode=0 Nov 29 14:46:48 crc kubenswrapper[4907]: I1129 14:46:48.103760 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerDied","Data":"3916d056fd091b3ea0ee0497ab57649502ad6f44cdd0ac9825274dd0c41b8376"} Nov 29 14:46:48 crc kubenswrapper[4907]: I1129 14:46:48.321653 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-csdjw" Nov 29 14:46:49 crc kubenswrapper[4907]: I1129 14:46:49.117846 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" 
event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerStarted","Data":"f09d5cc907d5cb48982682bcc57f17916efb19243a4dfe690080f2fdc18dec85"} Nov 29 14:46:49 crc kubenswrapper[4907]: I1129 14:46:49.118152 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerStarted","Data":"8b6db6d33d9319d78db36bcc4b69e19c92eb9945d6741de8518655ac8e9ac0ed"} Nov 29 14:46:49 crc kubenswrapper[4907]: I1129 14:46:49.118163 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerStarted","Data":"272ad81b2ec0c9347befc2f48161ebbac0144f802df682faa6e9a35c55460cb9"} Nov 29 14:46:49 crc kubenswrapper[4907]: I1129 14:46:49.118173 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerStarted","Data":"cd1eba124a88f162b95c981fc3c3a7521a54953db0d88748786d22d7461938a9"} Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.150515 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerStarted","Data":"d904aaf92505dba0dec1237669d23ebfc8bbc63ca80036ac3c79d7457495dd63"} Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.151153 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.151171 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-84ncb" event={"ID":"df794960-249c-4965-814c-36decf5db5d3","Type":"ContainerStarted","Data":"acd0334e45e27dc46e4df6222b964575da687c6b94681eecec5455a223364d7f"} Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.191806 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-84ncb" podStartSLOduration=7.116165007 podStartE2EDuration="15.191786616s" podCreationTimestamp="2025-11-29 14:46:36 +0000 UTC" firstStartedPulling="2025-11-29 14:46:36.865803147 +0000 UTC m=+1094.852640799" lastFinishedPulling="2025-11-29 14:46:44.941424716 +0000 UTC m=+1102.928262408" observedRunningTime="2025-11-29 14:46:51.187271518 +0000 UTC m=+1109.174109180" watchObservedRunningTime="2025-11-29 14:46:51.191786616 +0000 UTC m=+1109.178624278" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.319351 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-brcht"] Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.320710 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-brcht" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.323059 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-4x6l9" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.323106 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.325860 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.455256 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-brcht"] Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.456366 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8rd4\" (UniqueName: \"kubernetes.io/projected/369dd0d0-d291-4997-9d69-1122a999c573-kube-api-access-z8rd4\") pod \"openstack-operator-index-brcht\" (UID: \"369dd0d0-d291-4997-9d69-1122a999c573\") " 
pod="openstack-operators/openstack-operator-index-brcht" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.557814 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8rd4\" (UniqueName: \"kubernetes.io/projected/369dd0d0-d291-4997-9d69-1122a999c573-kube-api-access-z8rd4\") pod \"openstack-operator-index-brcht\" (UID: \"369dd0d0-d291-4997-9d69-1122a999c573\") " pod="openstack-operators/openstack-operator-index-brcht" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.588296 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8rd4\" (UniqueName: \"kubernetes.io/projected/369dd0d0-d291-4997-9d69-1122a999c573-kube-api-access-z8rd4\") pod \"openstack-operator-index-brcht\" (UID: \"369dd0d0-d291-4997-9d69-1122a999c573\") " pod="openstack-operators/openstack-operator-index-brcht" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.661757 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.678804 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-brcht" Nov 29 14:46:51 crc kubenswrapper[4907]: I1129 14:46:51.702970 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-84ncb" Nov 29 14:46:52 crc kubenswrapper[4907]: W1129 14:46:52.128453 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod369dd0d0_d291_4997_9d69_1122a999c573.slice/crio-7f36fece701d2ccbfa550a6e7c839661782ae84e955c4467d6c2f6ad213e17d5 WatchSource:0}: Error finding container 7f36fece701d2ccbfa550a6e7c839661782ae84e955c4467d6c2f6ad213e17d5: Status 404 returned error can't find the container with id 7f36fece701d2ccbfa550a6e7c839661782ae84e955c4467d6c2f6ad213e17d5 Nov 29 14:46:52 crc kubenswrapper[4907]: I1129 14:46:52.133589 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-brcht"] Nov 29 14:46:52 crc kubenswrapper[4907]: I1129 14:46:52.164695 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-brcht" event={"ID":"369dd0d0-d291-4997-9d69-1122a999c573","Type":"ContainerStarted","Data":"7f36fece701d2ccbfa550a6e7c839661782ae84e955c4467d6c2f6ad213e17d5"} Nov 29 14:46:54 crc kubenswrapper[4907]: I1129 14:46:54.691249 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-brcht"] Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.195287 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-brcht" event={"ID":"369dd0d0-d291-4997-9d69-1122a999c573","Type":"ContainerStarted","Data":"78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844"} Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.195497 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-brcht" 
podUID="369dd0d0-d291-4997-9d69-1122a999c573" containerName="registry-server" containerID="cri-o://78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844" gracePeriod=2 Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.230793 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-brcht" podStartSLOduration=1.51685868 podStartE2EDuration="4.230766457s" podCreationTimestamp="2025-11-29 14:46:51 +0000 UTC" firstStartedPulling="2025-11-29 14:46:52.135602225 +0000 UTC m=+1110.122439877" lastFinishedPulling="2025-11-29 14:46:54.849510002 +0000 UTC m=+1112.836347654" observedRunningTime="2025-11-29 14:46:55.221539416 +0000 UTC m=+1113.208377078" watchObservedRunningTime="2025-11-29 14:46:55.230766457 +0000 UTC m=+1113.217604119" Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.307830 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2m2ns"] Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.309180 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2m2ns" Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.316730 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2m2ns"] Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.439386 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpjmw\" (UniqueName: \"kubernetes.io/projected/ca80acab-472d-46fd-97c1-f432ddf7bb64-kube-api-access-bpjmw\") pod \"openstack-operator-index-2m2ns\" (UID: \"ca80acab-472d-46fd-97c1-f432ddf7bb64\") " pod="openstack-operators/openstack-operator-index-2m2ns" Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.541784 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpjmw\" (UniqueName: \"kubernetes.io/projected/ca80acab-472d-46fd-97c1-f432ddf7bb64-kube-api-access-bpjmw\") pod \"openstack-operator-index-2m2ns\" (UID: \"ca80acab-472d-46fd-97c1-f432ddf7bb64\") " pod="openstack-operators/openstack-operator-index-2m2ns" Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.574041 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpjmw\" (UniqueName: \"kubernetes.io/projected/ca80acab-472d-46fd-97c1-f432ddf7bb64-kube-api-access-bpjmw\") pod \"openstack-operator-index-2m2ns\" (UID: \"ca80acab-472d-46fd-97c1-f432ddf7bb64\") " pod="openstack-operators/openstack-operator-index-2m2ns" Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.630497 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-brcht" Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.662073 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2m2ns" Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.745356 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8rd4\" (UniqueName: \"kubernetes.io/projected/369dd0d0-d291-4997-9d69-1122a999c573-kube-api-access-z8rd4\") pod \"369dd0d0-d291-4997-9d69-1122a999c573\" (UID: \"369dd0d0-d291-4997-9d69-1122a999c573\") " Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.749899 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369dd0d0-d291-4997-9d69-1122a999c573-kube-api-access-z8rd4" (OuterVolumeSpecName: "kube-api-access-z8rd4") pod "369dd0d0-d291-4997-9d69-1122a999c573" (UID: "369dd0d0-d291-4997-9d69-1122a999c573"). InnerVolumeSpecName "kube-api-access-z8rd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:46:55 crc kubenswrapper[4907]: I1129 14:46:55.848168 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8rd4\" (UniqueName: \"kubernetes.io/projected/369dd0d0-d291-4997-9d69-1122a999c573-kube-api-access-z8rd4\") on node \"crc\" DevicePath \"\"" Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.096423 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2m2ns"] Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.210780 4907 generic.go:334] "Generic (PLEG): container finished" podID="369dd0d0-d291-4997-9d69-1122a999c573" containerID="78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844" exitCode=0 Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.210879 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-brcht" event={"ID":"369dd0d0-d291-4997-9d69-1122a999c573","Type":"ContainerDied","Data":"78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844"} Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 
14:46:56.210925 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-brcht" event={"ID":"369dd0d0-d291-4997-9d69-1122a999c573","Type":"ContainerDied","Data":"7f36fece701d2ccbfa550a6e7c839661782ae84e955c4467d6c2f6ad213e17d5"} Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.210964 4907 scope.go:117] "RemoveContainer" containerID="78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844" Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.211153 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-brcht" Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.216431 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2m2ns" event={"ID":"ca80acab-472d-46fd-97c1-f432ddf7bb64","Type":"ContainerStarted","Data":"d190e4c90a565b55d13a14a5e14e6b127e5f08a17753fa4b5cd77b4ca3385cb8"} Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.272584 4907 scope.go:117] "RemoveContainer" containerID="78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844" Nov 29 14:46:56 crc kubenswrapper[4907]: E1129 14:46:56.273136 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844\": container with ID starting with 78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844 not found: ID does not exist" containerID="78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844" Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.273199 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844"} err="failed to get container status \"78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844\": rpc error: code = NotFound desc = 
could not find container \"78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844\": container with ID starting with 78e0b6cfb846ae56152a8362419afbab55d51d684190d6a1b15a98b35be85844 not found: ID does not exist" Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.292063 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-brcht"] Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.302821 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-brcht"] Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.490754 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="369dd0d0-d291-4997-9d69-1122a999c573" path="/var/lib/kubelet/pods/369dd0d0-d291-4997-9d69-1122a999c573/volumes" Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.742023 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdzhz" Nov 29 14:46:56 crc kubenswrapper[4907]: I1129 14:46:56.850078 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-bkm82" Nov 29 14:46:57 crc kubenswrapper[4907]: I1129 14:46:57.234794 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2m2ns" event={"ID":"ca80acab-472d-46fd-97c1-f432ddf7bb64","Type":"ContainerStarted","Data":"6475648ad93a44a88624d6a23149d8ffcaf6a818562989855bacf05f5f845487"} Nov 29 14:46:57 crc kubenswrapper[4907]: I1129 14:46:57.255630 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2m2ns" podStartSLOduration=2.200194353 podStartE2EDuration="2.255602754s" podCreationTimestamp="2025-11-29 14:46:55 +0000 UTC" firstStartedPulling="2025-11-29 14:46:56.114576205 +0000 UTC m=+1114.101413857" lastFinishedPulling="2025-11-29 14:46:56.169984566 +0000 UTC 
m=+1114.156822258" observedRunningTime="2025-11-29 14:46:57.255122881 +0000 UTC m=+1115.241960583" watchObservedRunningTime="2025-11-29 14:46:57.255602754 +0000 UTC m=+1115.242440446" Nov 29 14:46:58 crc kubenswrapper[4907]: I1129 14:46:58.490794 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:46:58 crc kubenswrapper[4907]: I1129 14:46:58.490899 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:47:05 crc kubenswrapper[4907]: I1129 14:47:05.662937 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-2m2ns" Nov 29 14:47:05 crc kubenswrapper[4907]: I1129 14:47:05.663337 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-2m2ns" Nov 29 14:47:05 crc kubenswrapper[4907]: I1129 14:47:05.716143 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-2m2ns" Nov 29 14:47:06 crc kubenswrapper[4907]: I1129 14:47:06.360315 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-2m2ns" Nov 29 14:47:06 crc kubenswrapper[4907]: I1129 14:47:06.665828 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-84ncb" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.754342 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s"] Nov 29 14:47:13 crc kubenswrapper[4907]: E1129 14:47:13.756330 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369dd0d0-d291-4997-9d69-1122a999c573" containerName="registry-server" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.756389 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="369dd0d0-d291-4997-9d69-1122a999c573" containerName="registry-server" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.756740 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="369dd0d0-d291-4997-9d69-1122a999c573" containerName="registry-server" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.758842 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.762976 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-rx85c" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.768711 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s"] Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.887873 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6t6x\" (UniqueName: \"kubernetes.io/projected/089ab608-2dbf-489d-bcc8-cb61ab4564b4-kube-api-access-w6t6x\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.887974 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-util\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.888891 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-bundle\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.992037 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6t6x\" (UniqueName: \"kubernetes.io/projected/089ab608-2dbf-489d-bcc8-cb61ab4564b4-kube-api-access-w6t6x\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.992206 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-util\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.992296 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-bundle\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s\" 
(UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.993273 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-bundle\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:13 crc kubenswrapper[4907]: I1129 14:47:13.993524 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-util\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:14 crc kubenswrapper[4907]: I1129 14:47:14.036815 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6t6x\" (UniqueName: \"kubernetes.io/projected/089ab608-2dbf-489d-bcc8-cb61ab4564b4-kube-api-access-w6t6x\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:14 crc kubenswrapper[4907]: I1129 14:47:14.087554 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:14 crc kubenswrapper[4907]: W1129 14:47:14.635255 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod089ab608_2dbf_489d_bcc8_cb61ab4564b4.slice/crio-272c52c5d411580b0ea5538e2cc82997721402ecf9b7af160bd97e90da87897e WatchSource:0}: Error finding container 272c52c5d411580b0ea5538e2cc82997721402ecf9b7af160bd97e90da87897e: Status 404 returned error can't find the container with id 272c52c5d411580b0ea5538e2cc82997721402ecf9b7af160bd97e90da87897e Nov 29 14:47:14 crc kubenswrapper[4907]: I1129 14:47:14.637623 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s"] Nov 29 14:47:15 crc kubenswrapper[4907]: I1129 14:47:15.413418 4907 generic.go:334] "Generic (PLEG): container finished" podID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerID="6fcaac4aaa470fc99d288c2578b6d60fb675ab4442b7b9e4c6de747556263a1d" exitCode=0 Nov 29 14:47:15 crc kubenswrapper[4907]: I1129 14:47:15.413808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" event={"ID":"089ab608-2dbf-489d-bcc8-cb61ab4564b4","Type":"ContainerDied","Data":"6fcaac4aaa470fc99d288c2578b6d60fb675ab4442b7b9e4c6de747556263a1d"} Nov 29 14:47:15 crc kubenswrapper[4907]: I1129 14:47:15.413840 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" event={"ID":"089ab608-2dbf-489d-bcc8-cb61ab4564b4","Type":"ContainerStarted","Data":"272c52c5d411580b0ea5538e2cc82997721402ecf9b7af160bd97e90da87897e"} Nov 29 14:47:16 crc kubenswrapper[4907]: I1129 14:47:16.427068 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerID="1839c1f08e24d1bd6d865aa74168b696e40c61fa460d367dc79d4bed67ee9aa6" exitCode=0 Nov 29 14:47:16 crc kubenswrapper[4907]: I1129 14:47:16.427200 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" event={"ID":"089ab608-2dbf-489d-bcc8-cb61ab4564b4","Type":"ContainerDied","Data":"1839c1f08e24d1bd6d865aa74168b696e40c61fa460d367dc79d4bed67ee9aa6"} Nov 29 14:47:17 crc kubenswrapper[4907]: I1129 14:47:17.442199 4907 generic.go:334] "Generic (PLEG): container finished" podID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerID="4b5b2f1b6080755a05fbfd4882063b0213c2e9297da166c5645b5d003e8e4ca1" exitCode=0 Nov 29 14:47:17 crc kubenswrapper[4907]: I1129 14:47:17.442472 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" event={"ID":"089ab608-2dbf-489d-bcc8-cb61ab4564b4","Type":"ContainerDied","Data":"4b5b2f1b6080755a05fbfd4882063b0213c2e9297da166c5645b5d003e8e4ca1"} Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.772709 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.882598 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-util\") pod \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.882706 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6t6x\" (UniqueName: \"kubernetes.io/projected/089ab608-2dbf-489d-bcc8-cb61ab4564b4-kube-api-access-w6t6x\") pod \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.882895 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-bundle\") pod \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\" (UID: \"089ab608-2dbf-489d-bcc8-cb61ab4564b4\") " Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.884053 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-bundle" (OuterVolumeSpecName: "bundle") pod "089ab608-2dbf-489d-bcc8-cb61ab4564b4" (UID: "089ab608-2dbf-489d-bcc8-cb61ab4564b4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.891578 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/089ab608-2dbf-489d-bcc8-cb61ab4564b4-kube-api-access-w6t6x" (OuterVolumeSpecName: "kube-api-access-w6t6x") pod "089ab608-2dbf-489d-bcc8-cb61ab4564b4" (UID: "089ab608-2dbf-489d-bcc8-cb61ab4564b4"). InnerVolumeSpecName "kube-api-access-w6t6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.905199 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-util" (OuterVolumeSpecName: "util") pod "089ab608-2dbf-489d-bcc8-cb61ab4564b4" (UID: "089ab608-2dbf-489d-bcc8-cb61ab4564b4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.984966 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-util\") on node \"crc\" DevicePath \"\"" Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.984996 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6t6x\" (UniqueName: \"kubernetes.io/projected/089ab608-2dbf-489d-bcc8-cb61ab4564b4-kube-api-access-w6t6x\") on node \"crc\" DevicePath \"\"" Nov 29 14:47:18 crc kubenswrapper[4907]: I1129 14:47:18.985009 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/089ab608-2dbf-489d-bcc8-cb61ab4564b4-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:47:19 crc kubenswrapper[4907]: I1129 14:47:19.462391 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" event={"ID":"089ab608-2dbf-489d-bcc8-cb61ab4564b4","Type":"ContainerDied","Data":"272c52c5d411580b0ea5538e2cc82997721402ecf9b7af160bd97e90da87897e"} Nov 29 14:47:19 crc kubenswrapper[4907]: I1129 14:47:19.462930 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="272c52c5d411580b0ea5538e2cc82997721402ecf9b7af160bd97e90da87897e" Nov 29 14:47:19 crc kubenswrapper[4907]: I1129 14:47:19.462537 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.714076 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq"] Nov 29 14:47:21 crc kubenswrapper[4907]: E1129 14:47:21.714754 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerName="util" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.714771 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerName="util" Nov 29 14:47:21 crc kubenswrapper[4907]: E1129 14:47:21.714806 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerName="extract" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.714814 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerName="extract" Nov 29 14:47:21 crc kubenswrapper[4907]: E1129 14:47:21.714841 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerName="pull" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.714851 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerName="pull" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.715039 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="089ab608-2dbf-489d-bcc8-cb61ab4564b4" containerName="extract" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.715773 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.718712 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-2sdnp" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.760553 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq"] Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.836455 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5x5w\" (UniqueName: \"kubernetes.io/projected/bd26c3e2-de76-4342-91c6-9ee4571f8619-kube-api-access-r5x5w\") pod \"openstack-operator-controller-operator-95b97cc44-vpslq\" (UID: \"bd26c3e2-de76-4342-91c6-9ee4571f8619\") " pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.938041 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5x5w\" (UniqueName: \"kubernetes.io/projected/bd26c3e2-de76-4342-91c6-9ee4571f8619-kube-api-access-r5x5w\") pod \"openstack-operator-controller-operator-95b97cc44-vpslq\" (UID: \"bd26c3e2-de76-4342-91c6-9ee4571f8619\") " pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" Nov 29 14:47:21 crc kubenswrapper[4907]: I1129 14:47:21.958545 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5x5w\" (UniqueName: \"kubernetes.io/projected/bd26c3e2-de76-4342-91c6-9ee4571f8619-kube-api-access-r5x5w\") pod \"openstack-operator-controller-operator-95b97cc44-vpslq\" (UID: \"bd26c3e2-de76-4342-91c6-9ee4571f8619\") " pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" Nov 29 14:47:22 crc kubenswrapper[4907]: I1129 14:47:22.036322 4907 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" Nov 29 14:47:22 crc kubenswrapper[4907]: I1129 14:47:22.563533 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq"] Nov 29 14:47:23 crc kubenswrapper[4907]: I1129 14:47:23.501238 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" event={"ID":"bd26c3e2-de76-4342-91c6-9ee4571f8619","Type":"ContainerStarted","Data":"a9cb0520b6a99a58ec0e6ba632bbd40abe6811026c174b6983d12c088bb0504b"} Nov 29 14:47:28 crc kubenswrapper[4907]: I1129 14:47:28.490048 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:47:28 crc kubenswrapper[4907]: I1129 14:47:28.490790 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:47:28 crc kubenswrapper[4907]: I1129 14:47:28.563000 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" event={"ID":"bd26c3e2-de76-4342-91c6-9ee4571f8619","Type":"ContainerStarted","Data":"fce315194cd9472e906ea5d1b7599f7f88c79b1c5d74726b3f24b9fbd498d8ce"} Nov 29 14:47:28 crc kubenswrapper[4907]: I1129 14:47:28.563749 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" Nov 29 14:47:28 crc kubenswrapper[4907]: I1129 14:47:28.598785 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" podStartSLOduration=2.79129306 podStartE2EDuration="7.598768652s" podCreationTimestamp="2025-11-29 14:47:21 +0000 UTC" firstStartedPulling="2025-11-29 14:47:22.56800067 +0000 UTC m=+1140.554838322" lastFinishedPulling="2025-11-29 14:47:27.375476262 +0000 UTC m=+1145.362313914" observedRunningTime="2025-11-29 14:47:28.588943503 +0000 UTC m=+1146.575781195" watchObservedRunningTime="2025-11-29 14:47:28.598768652 +0000 UTC m=+1146.585606304" Nov 29 14:47:32 crc kubenswrapper[4907]: I1129 14:47:32.041740 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-95b97cc44-vpslq" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.077127 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.078795 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.081296 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jxgrn" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.090066 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.103683 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.105027 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.106602 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4wtrz" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.109991 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.111251 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.118242 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-f8k2b" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.145672 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.145916 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk6kd\" (UniqueName: \"kubernetes.io/projected/cdcaa6fe-2208-49d5-82d8-9b2c96be251d-kube-api-access-bk6kd\") pod \"designate-operator-controller-manager-78b4bc895b-brs2h\" (UID: \"cdcaa6fe-2208-49d5-82d8-9b2c96be251d\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.145977 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n5sg\" (UniqueName: \"kubernetes.io/projected/71e0b5bc-68d6-434d-97c4-0c6d3a324e15-kube-api-access-2n5sg\") pod \"cinder-operator-controller-manager-859b6ccc6-kqbx2\" (UID: \"71e0b5bc-68d6-434d-97c4-0c6d3a324e15\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.146036 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn76v\" (UniqueName: \"kubernetes.io/projected/958f375f-e7a8-4d96-b2a1-dc5a63cdc865-kube-api-access-xn76v\") pod \"barbican-operator-controller-manager-7d9dfd778-b8f6h\" (UID: \"958f375f-e7a8-4d96-b2a1-dc5a63cdc865\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 
14:47:51.147138 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.151601 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4z8wb" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.167427 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.185474 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.254378 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk6kd\" (UniqueName: \"kubernetes.io/projected/cdcaa6fe-2208-49d5-82d8-9b2c96be251d-kube-api-access-bk6kd\") pod \"designate-operator-controller-manager-78b4bc895b-brs2h\" (UID: \"cdcaa6fe-2208-49d5-82d8-9b2c96be251d\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.254449 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n5sg\" (UniqueName: \"kubernetes.io/projected/71e0b5bc-68d6-434d-97c4-0c6d3a324e15-kube-api-access-2n5sg\") pod \"cinder-operator-controller-manager-859b6ccc6-kqbx2\" (UID: \"71e0b5bc-68d6-434d-97c4-0c6d3a324e15\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.254496 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn76v\" (UniqueName: \"kubernetes.io/projected/958f375f-e7a8-4d96-b2a1-dc5a63cdc865-kube-api-access-xn76v\") pod 
\"barbican-operator-controller-manager-7d9dfd778-b8f6h\" (UID: \"958f375f-e7a8-4d96-b2a1-dc5a63cdc865\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.254572 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkq5q\" (UniqueName: \"kubernetes.io/projected/aca0ecce-183f-40cd-8ab0-aed5caf29556-kube-api-access-pkq5q\") pod \"glance-operator-controller-manager-668d9c48b9-28hrq\" (UID: \"aca0ecce-183f-40cd-8ab0-aed5caf29556\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.257028 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.307166 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n5sg\" (UniqueName: \"kubernetes.io/projected/71e0b5bc-68d6-434d-97c4-0c6d3a324e15-kube-api-access-2n5sg\") pod \"cinder-operator-controller-manager-859b6ccc6-kqbx2\" (UID: \"71e0b5bc-68d6-434d-97c4-0c6d3a324e15\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.312358 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn76v\" (UniqueName: \"kubernetes.io/projected/958f375f-e7a8-4d96-b2a1-dc5a63cdc865-kube-api-access-xn76v\") pod \"barbican-operator-controller-manager-7d9dfd778-b8f6h\" (UID: \"958f375f-e7a8-4d96-b2a1-dc5a63cdc865\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.312991 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk6kd\" (UniqueName: 
\"kubernetes.io/projected/cdcaa6fe-2208-49d5-82d8-9b2c96be251d-kube-api-access-bk6kd\") pod \"designate-operator-controller-manager-78b4bc895b-brs2h\" (UID: \"cdcaa6fe-2208-49d5-82d8-9b2c96be251d\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.328035 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.330327 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.341856 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-kvzgj" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.386419 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkb8r\" (UniqueName: \"kubernetes.io/projected/bbc59bc4-78c2-4534-b1bd-93cf4b60f86e-kube-api-access-hkb8r\") pod \"heat-operator-controller-manager-5f64f6f8bb-28vdp\" (UID: \"bbc59bc4-78c2-4534-b1bd-93cf4b60f86e\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.386553 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkq5q\" (UniqueName: \"kubernetes.io/projected/aca0ecce-183f-40cd-8ab0-aed5caf29556-kube-api-access-pkq5q\") pod \"glance-operator-controller-manager-668d9c48b9-28hrq\" (UID: \"aca0ecce-183f-40cd-8ab0-aed5caf29556\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.391305 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.407829 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.414633 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.430208 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkq5q\" (UniqueName: \"kubernetes.io/projected/aca0ecce-183f-40cd-8ab0-aed5caf29556-kube-api-access-pkq5q\") pod \"glance-operator-controller-manager-668d9c48b9-28hrq\" (UID: \"aca0ecce-183f-40cd-8ab0-aed5caf29556\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.436861 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-j9kmg" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.459263 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.478640 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.485811 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.492612 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjctp\" (UniqueName: \"kubernetes.io/projected/cf7efbf1-79c8-45f9-8bed-7a33f47226ef-kube-api-access-jjctp\") pod \"horizon-operator-controller-manager-68c6d99b8f-wspkj\" (UID: \"cf7efbf1-79c8-45f9-8bed-7a33f47226ef\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.492802 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkb8r\" (UniqueName: \"kubernetes.io/projected/bbc59bc4-78c2-4534-b1bd-93cf4b60f86e-kube-api-access-hkb8r\") pod \"heat-operator-controller-manager-5f64f6f8bb-28vdp\" (UID: \"bbc59bc4-78c2-4534-b1bd-93cf4b60f86e\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.509546 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.550149 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.551104 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkb8r\" (UniqueName: \"kubernetes.io/projected/bbc59bc4-78c2-4534-b1bd-93cf4b60f86e-kube-api-access-hkb8r\") pod \"heat-operator-controller-manager-5f64f6f8bb-28vdp\" (UID: \"bbc59bc4-78c2-4534-b1bd-93cf4b60f86e\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.569513 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-z92f7"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.570982 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.577128 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rdqvj" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.577276 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.605365 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjctp\" (UniqueName: \"kubernetes.io/projected/cf7efbf1-79c8-45f9-8bed-7a33f47226ef-kube-api-access-jjctp\") pod \"horizon-operator-controller-manager-68c6d99b8f-wspkj\" (UID: \"cf7efbf1-79c8-45f9-8bed-7a33f47226ef\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.690699 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.692273 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.710199 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jdw\" (UniqueName: \"kubernetes.io/projected/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-kube-api-access-r7jdw\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.710259 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.711214 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lvvvv" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.714348 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.729498 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-z92f7"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.760562 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjctp\" (UniqueName: \"kubernetes.io/projected/cf7efbf1-79c8-45f9-8bed-7a33f47226ef-kube-api-access-jjctp\") pod \"horizon-operator-controller-manager-68c6d99b8f-wspkj\" (UID: \"cf7efbf1-79c8-45f9-8bed-7a33f47226ef\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.783812 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.811495 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mpk\" (UniqueName: \"kubernetes.io/projected/e12e8dfe-6b2d-49f4-90d1-3165ec08f043-kube-api-access-b4mpk\") pod \"ironic-operator-controller-manager-6c548fd776-xqw5l\" (UID: \"e12e8dfe-6b2d-49f4-90d1-3165ec08f043\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.811814 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7jdw\" (UniqueName: \"kubernetes.io/projected/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-kube-api-access-r7jdw\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.811844 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:51 crc kubenswrapper[4907]: E1129 14:47:51.812024 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:51 crc kubenswrapper[4907]: E1129 14:47:51.812075 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert podName:f9c64e5e-531f-4f11-b7d5-e22ed46b9b86 nodeName:}" failed. No retries permitted until 2025-11-29 14:47:52.312056045 +0000 UTC m=+1170.298893697 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert") pod "infra-operator-controller-manager-57548d458d-z92f7" (UID: "f9c64e5e-531f-4f11-b7d5-e22ed46b9b86") : secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.819143 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.820536 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.823499 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.827311 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ptcwg" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.840583 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.841862 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.842485 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.843198 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7jdw\" (UniqueName: \"kubernetes.io/projected/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-kube-api-access-r7jdw\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.843465 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2qnzr" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.852477 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.866494 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.867960 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.877127 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.879178 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.884878 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-29xrb" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.892073 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-4g4n7" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.910401 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.914009 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8fr\" (UniqueName: \"kubernetes.io/projected/46a74794-f3b0-4bf0-9c94-0920441fd3ce-kube-api-access-6k8fr\") pod \"manila-operator-controller-manager-6546668bfd-zspmk\" (UID: \"46a74794-f3b0-4bf0-9c94-0920441fd3ce\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.914060 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk7v9\" (UniqueName: \"kubernetes.io/projected/2b9ca9f5-7979-47ef-9e37-f9c519b57445-kube-api-access-lk7v9\") pod \"mariadb-operator-controller-manager-56bbcc9d85-4dlgt\" (UID: \"2b9ca9f5-7979-47ef-9e37-f9c519b57445\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.914119 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq2sr\" (UniqueName: \"kubernetes.io/projected/103b9723-75c0-41ca-8264-41912d22a5cb-kube-api-access-vq2sr\") pod 
\"neutron-operator-controller-manager-5fdfd5b6b5-jm57w\" (UID: \"103b9723-75c0-41ca-8264-41912d22a5cb\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.914145 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7q6p\" (UniqueName: \"kubernetes.io/projected/d9c5b591-4e0f-4f9e-930d-070798fccb44-kube-api-access-d7q6p\") pod \"keystone-operator-controller-manager-546d4bdf48-kqq5w\" (UID: \"d9c5b591-4e0f-4f9e-930d-070798fccb44\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.914198 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mpk\" (UniqueName: \"kubernetes.io/projected/e12e8dfe-6b2d-49f4-90d1-3165ec08f043-kube-api-access-b4mpk\") pod \"ironic-operator-controller-manager-6c548fd776-xqw5l\" (UID: \"e12e8dfe-6b2d-49f4-90d1-3165ec08f043\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.926916 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.944492 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.945274 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mpk\" (UniqueName: \"kubernetes.io/projected/e12e8dfe-6b2d-49f4-90d1-3165ec08f043-kube-api-access-b4mpk\") pod \"ironic-operator-controller-manager-6c548fd776-xqw5l\" (UID: \"e12e8dfe-6b2d-49f4-90d1-3165ec08f043\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" Nov 29 14:47:51 crc 
kubenswrapper[4907]: I1129 14:47:51.945754 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.950185 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-qhggl"] Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.955911 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-t5ljs" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.957118 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" Nov 29 14:47:51 crc kubenswrapper[4907]: I1129 14:47:51.963818 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wwrjt" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:51.990267 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-qhggl"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.017995 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq2sr\" (UniqueName: \"kubernetes.io/projected/103b9723-75c0-41ca-8264-41912d22a5cb-kube-api-access-vq2sr\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-jm57w\" (UID: \"103b9723-75c0-41ca-8264-41912d22a5cb\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.018072 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7q6p\" (UniqueName: \"kubernetes.io/projected/d9c5b591-4e0f-4f9e-930d-070798fccb44-kube-api-access-d7q6p\") pod 
\"keystone-operator-controller-manager-546d4bdf48-kqq5w\" (UID: \"d9c5b591-4e0f-4f9e-930d-070798fccb44\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.018148 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcfrn\" (UniqueName: \"kubernetes.io/projected/3cbe9b24-61e0-449f-a91f-289fd9c5de8e-kube-api-access-xcfrn\") pod \"nova-operator-controller-manager-697bc559fc-p8qdf\" (UID: \"3cbe9b24-61e0-449f-a91f-289fd9c5de8e\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.018239 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8fr\" (UniqueName: \"kubernetes.io/projected/46a74794-f3b0-4bf0-9c94-0920441fd3ce-kube-api-access-6k8fr\") pod \"manila-operator-controller-manager-6546668bfd-zspmk\" (UID: \"46a74794-f3b0-4bf0-9c94-0920441fd3ce\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.018260 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh286\" (UniqueName: \"kubernetes.io/projected/8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91-kube-api-access-lh286\") pod \"octavia-operator-controller-manager-998648c74-qhggl\" (UID: \"8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.018312 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7v9\" (UniqueName: \"kubernetes.io/projected/2b9ca9f5-7979-47ef-9e37-f9c519b57445-kube-api-access-lk7v9\") pod \"mariadb-operator-controller-manager-56bbcc9d85-4dlgt\" (UID: \"2b9ca9f5-7979-47ef-9e37-f9c519b57445\") " 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.023048 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.046961 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8fr\" (UniqueName: \"kubernetes.io/projected/46a74794-f3b0-4bf0-9c94-0920441fd3ce-kube-api-access-6k8fr\") pod \"manila-operator-controller-manager-6546668bfd-zspmk\" (UID: \"46a74794-f3b0-4bf0-9c94-0920441fd3ce\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.050061 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7q6p\" (UniqueName: \"kubernetes.io/projected/d9c5b591-4e0f-4f9e-930d-070798fccb44-kube-api-access-d7q6p\") pod \"keystone-operator-controller-manager-546d4bdf48-kqq5w\" (UID: \"d9c5b591-4e0f-4f9e-930d-070798fccb44\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.053108 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk7v9\" (UniqueName: \"kubernetes.io/projected/2b9ca9f5-7979-47ef-9e37-f9c519b57445-kube-api-access-lk7v9\") pod \"mariadb-operator-controller-manager-56bbcc9d85-4dlgt\" (UID: \"2b9ca9f5-7979-47ef-9e37-f9c519b57445\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.055534 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq2sr\" (UniqueName: \"kubernetes.io/projected/103b9723-75c0-41ca-8264-41912d22a5cb-kube-api-access-vq2sr\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-jm57w\" (UID: 
\"103b9723-75c0-41ca-8264-41912d22a5cb\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.056242 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.061154 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.065755 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.065880 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-kdwfk" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.088377 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.092581 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.093929 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.101655 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-c8tmz" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.103513 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.104707 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.107235 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-7s2g5" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.111977 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.120299 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh286\" (UniqueName: \"kubernetes.io/projected/8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91-kube-api-access-lh286\") pod \"octavia-operator-controller-manager-998648c74-qhggl\" (UID: \"8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.120419 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.120460 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwbpv\" (UniqueName: \"kubernetes.io/projected/711dd79a-4219-43a3-9767-aad244b9c68f-kube-api-access-mwbpv\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.120481 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcfrn\" (UniqueName: \"kubernetes.io/projected/3cbe9b24-61e0-449f-a91f-289fd9c5de8e-kube-api-access-xcfrn\") pod \"nova-operator-controller-manager-697bc559fc-p8qdf\" (UID: \"3cbe9b24-61e0-449f-a91f-289fd9c5de8e\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.121110 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p88rd\" (UniqueName: \"kubernetes.io/projected/7f331ad4-d753-41f4-82f9-c2bd60806987-kube-api-access-p88rd\") pod \"ovn-operator-controller-manager-b6456fdb6-gs88x\" (UID: \"7f331ad4-d753-41f4-82f9-c2bd60806987\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.130019 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.148782 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcfrn\" (UniqueName: \"kubernetes.io/projected/3cbe9b24-61e0-449f-a91f-289fd9c5de8e-kube-api-access-xcfrn\") pod 
\"nova-operator-controller-manager-697bc559fc-p8qdf\" (UID: \"3cbe9b24-61e0-449f-a91f-289fd9c5de8e\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.158287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh286\" (UniqueName: \"kubernetes.io/projected/8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91-kube-api-access-lh286\") pod \"octavia-operator-controller-manager-998648c74-qhggl\" (UID: \"8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.174385 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.177030 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.203172 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.220685 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.222286 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.223565 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlrpl\" (UniqueName: \"kubernetes.io/projected/d193cf7e-774f-44b3-ae22-090d09c15ba5-kube-api-access-tlrpl\") pod \"placement-operator-controller-manager-78f8948974-fdjx8\" (UID: \"d193cf7e-774f-44b3-ae22-090d09c15ba5\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.223698 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.223729 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwbpv\" (UniqueName: \"kubernetes.io/projected/711dd79a-4219-43a3-9767-aad244b9c68f-kube-api-access-mwbpv\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.223861 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p88rd\" (UniqueName: \"kubernetes.io/projected/7f331ad4-d753-41f4-82f9-c2bd60806987-kube-api-access-p88rd\") pod \"ovn-operator-controller-manager-b6456fdb6-gs88x\" (UID: \"7f331ad4-d753-41f4-82f9-c2bd60806987\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" Nov 29 14:47:52 crc 
kubenswrapper[4907]: E1129 14:47:52.224563 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: E1129 14:47:52.224609 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert podName:711dd79a-4219-43a3-9767-aad244b9c68f nodeName:}" failed. No retries permitted until 2025-11-29 14:47:52.724595977 +0000 UTC m=+1170.711433629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" (UID: "711dd79a-4219-43a3-9767-aad244b9c68f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.226102 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-wvvf5" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.240907 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.241497 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p88rd\" (UniqueName: \"kubernetes.io/projected/7f331ad4-d753-41f4-82f9-c2bd60806987-kube-api-access-p88rd\") pod \"ovn-operator-controller-manager-b6456fdb6-gs88x\" (UID: \"7f331ad4-d753-41f4-82f9-c2bd60806987\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.241881 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.245855 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwbpv\" (UniqueName: \"kubernetes.io/projected/711dd79a-4219-43a3-9767-aad244b9c68f-kube-api-access-mwbpv\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.246923 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.249261 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.250887 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-h6wnh" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.256669 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.266530 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.270052 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.272201 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-jtw4q" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.282921 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.305654 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.307629 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.312117 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gr24x" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.316585 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.316736 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.325840 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcnbg\" (UniqueName: \"kubernetes.io/projected/fd726e7f-5139-4eb6-b18d-24d14682648c-kube-api-access-xcnbg\") pod \"swift-operator-controller-manager-5f8c65bbfc-h9tkn\" (UID: \"fd726e7f-5139-4eb6-b18d-24d14682648c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.325914 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wxjm\" (UniqueName: \"kubernetes.io/projected/dceea103-7394-4d01-9168-0b3f5b49306f-kube-api-access-2wxjm\") pod \"telemetry-operator-controller-manager-86bbb9c7fb-ldhkh\" (UID: \"dceea103-7394-4d01-9168-0b3f5b49306f\") " pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.325943 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlrpl\" (UniqueName: \"kubernetes.io/projected/d193cf7e-774f-44b3-ae22-090d09c15ba5-kube-api-access-tlrpl\") pod \"placement-operator-controller-manager-78f8948974-fdjx8\" (UID: \"d193cf7e-774f-44b3-ae22-090d09c15ba5\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.325991 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl6bf\" (UniqueName: 
\"kubernetes.io/projected/418d0e3c-7354-4e14-b17b-bab93518e78b-kube-api-access-gl6bf\") pod \"test-operator-controller-manager-5854674fcc-6h9pm\" (UID: \"418d0e3c-7354-4e14-b17b-bab93518e78b\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.326013 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.326042 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrsm\" (UniqueName: \"kubernetes.io/projected/45db6747-0449-4839-b0ed-07e930579b83-kube-api-access-jtrsm\") pod \"watcher-operator-controller-manager-769dc69bc-dpdcs\" (UID: \"45db6747-0449-4839-b0ed-07e930579b83\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" Nov 29 14:47:52 crc kubenswrapper[4907]: E1129 14:47:52.326375 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: E1129 14:47:52.326417 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert podName:f9c64e5e-531f-4f11-b7d5-e22ed46b9b86 nodeName:}" failed. No retries permitted until 2025-11-29 14:47:53.326404983 +0000 UTC m=+1171.313242635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert") pod "infra-operator-controller-manager-57548d458d-z92f7" (UID: "f9c64e5e-531f-4f11-b7d5-e22ed46b9b86") : secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.328860 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.345384 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlrpl\" (UniqueName: \"kubernetes.io/projected/d193cf7e-774f-44b3-ae22-090d09c15ba5-kube-api-access-tlrpl\") pod \"placement-operator-controller-manager-78f8948974-fdjx8\" (UID: \"d193cf7e-774f-44b3-ae22-090d09c15ba5\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.349711 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.400925 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.402360 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.407190 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.412952 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.413141 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-k4ndw" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.413372 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.423471 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.431505 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.431588 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl6bf\" (UniqueName: \"kubernetes.io/projected/418d0e3c-7354-4e14-b17b-bab93518e78b-kube-api-access-gl6bf\") pod \"test-operator-controller-manager-5854674fcc-6h9pm\" (UID: \"418d0e3c-7354-4e14-b17b-bab93518e78b\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" Nov 29 14:47:52 crc 
kubenswrapper[4907]: I1129 14:47:52.431662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrsm\" (UniqueName: \"kubernetes.io/projected/45db6747-0449-4839-b0ed-07e930579b83-kube-api-access-jtrsm\") pod \"watcher-operator-controller-manager-769dc69bc-dpdcs\" (UID: \"45db6747-0449-4839-b0ed-07e930579b83\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.431768 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsxl9\" (UniqueName: \"kubernetes.io/projected/e1f880b2-0e04-48c5-81ca-0103abd439fe-kube-api-access-jsxl9\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.431793 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcnbg\" (UniqueName: \"kubernetes.io/projected/fd726e7f-5139-4eb6-b18d-24d14682648c-kube-api-access-xcnbg\") pod \"swift-operator-controller-manager-5f8c65bbfc-h9tkn\" (UID: \"fd726e7f-5139-4eb6-b18d-24d14682648c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.431810 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.431874 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2wxjm\" (UniqueName: \"kubernetes.io/projected/dceea103-7394-4d01-9168-0b3f5b49306f-kube-api-access-2wxjm\") pod \"telemetry-operator-controller-manager-86bbb9c7fb-ldhkh\" (UID: \"dceea103-7394-4d01-9168-0b3f5b49306f\") " pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.432227 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.444176 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-qpkmd" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.448730 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.452875 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.458037 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wxjm\" (UniqueName: \"kubernetes.io/projected/dceea103-7394-4d01-9168-0b3f5b49306f-kube-api-access-2wxjm\") pod \"telemetry-operator-controller-manager-86bbb9c7fb-ldhkh\" (UID: \"dceea103-7394-4d01-9168-0b3f5b49306f\") " pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.465140 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.493922 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcnbg\" (UniqueName: \"kubernetes.io/projected/fd726e7f-5139-4eb6-b18d-24d14682648c-kube-api-access-xcnbg\") pod \"swift-operator-controller-manager-5f8c65bbfc-h9tkn\" (UID: \"fd726e7f-5139-4eb6-b18d-24d14682648c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.495299 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl6bf\" (UniqueName: \"kubernetes.io/projected/418d0e3c-7354-4e14-b17b-bab93518e78b-kube-api-access-gl6bf\") pod \"test-operator-controller-manager-5854674fcc-6h9pm\" (UID: \"418d0e3c-7354-4e14-b17b-bab93518e78b\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.495526 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrsm\" (UniqueName: \"kubernetes.io/projected/45db6747-0449-4839-b0ed-07e930579b83-kube-api-access-jtrsm\") pod \"watcher-operator-controller-manager-769dc69bc-dpdcs\" (UID: \"45db6747-0449-4839-b0ed-07e930579b83\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.535257 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.535333 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m869c\" (UniqueName: \"kubernetes.io/projected/ec0d115a-0b4b-4691-b4f4-778ffd7f6219-kube-api-access-m869c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kd564\" (UID: \"ec0d115a-0b4b-4691-b4f4-778ffd7f6219\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.535416 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsxl9\" (UniqueName: \"kubernetes.io/projected/e1f880b2-0e04-48c5-81ca-0103abd439fe-kube-api-access-jsxl9\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:52 crc kubenswrapper[4907]: E1129 14:47:52.535425 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: E1129 14:47:52.535518 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:47:53.035499339 +0000 UTC m=+1171.022336991 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "webhook-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: E1129 14:47:52.535536 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: E1129 14:47:52.535581 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:47:53.035567251 +0000 UTC m=+1171.022404903 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "metrics-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.535451 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.538403 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.546406 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.547928 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.559024 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.572243 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.576023 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsxl9\" (UniqueName: \"kubernetes.io/projected/e1f880b2-0e04-48c5-81ca-0103abd439fe-kube-api-access-jsxl9\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.647896 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m869c\" (UniqueName: \"kubernetes.io/projected/ec0d115a-0b4b-4691-b4f4-778ffd7f6219-kube-api-access-m869c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kd564\" (UID: \"ec0d115a-0b4b-4691-b4f4-778ffd7f6219\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.715788 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m869c\" (UniqueName: \"kubernetes.io/projected/ec0d115a-0b4b-4691-b4f4-778ffd7f6219-kube-api-access-m869c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kd564\" (UID: \"ec0d115a-0b4b-4691-b4f4-778ffd7f6219\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.753775 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:52 crc kubenswrapper[4907]: E1129 14:47:52.753948 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: E1129 14:47:52.754001 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert podName:711dd79a-4219-43a3-9767-aad244b9c68f nodeName:}" failed. No retries permitted until 2025-11-29 14:47:53.753984441 +0000 UTC m=+1171.740822093 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" (UID: "711dd79a-4219-43a3-9767-aad244b9c68f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.757199 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.769865 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.785818 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.827660 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" event={"ID":"71e0b5bc-68d6-434d-97c4-0c6d3a324e15","Type":"ContainerStarted","Data":"be1cc8b3071a4c08a871b13d115a8af6582dfec4774c11d39416c59749c916c5"} Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.832760 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" event={"ID":"958f375f-e7a8-4d96-b2a1-dc5a63cdc865","Type":"ContainerStarted","Data":"443e3ab15590965934df2a6d5005a23c3d22a7b013a1b30e1ac4cb3d88ce9f5e"} Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.834557 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2"] Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.835855 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" 
event={"ID":"cf7efbf1-79c8-45f9-8bed-7a33f47226ef","Type":"ContainerStarted","Data":"2a3842119518999cb363807a020d12af16da2a29a4c68115b00374930f44274b"} Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.849227 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" event={"ID":"aca0ecce-183f-40cd-8ab0-aed5caf29556","Type":"ContainerStarted","Data":"c57cb72b98895c2efa4a5713b6c245bbbea44bcd3d3438af8dd3e4d81416adc5"} Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.924643 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" Nov 29 14:47:52 crc kubenswrapper[4907]: I1129 14:47:52.978493 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp"] Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.058218 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.058306 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:53 crc kubenswrapper[4907]: E1129 14:47:53.058486 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 14:47:53 
crc kubenswrapper[4907]: E1129 14:47:53.058571 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:47:54.058551362 +0000 UTC m=+1172.045389014 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "metrics-server-cert" not found Nov 29 14:47:53 crc kubenswrapper[4907]: E1129 14:47:53.058490 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 14:47:53 crc kubenswrapper[4907]: E1129 14:47:53.058812 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:47:54.058794019 +0000 UTC m=+1172.045631671 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "webhook-server-cert" not found Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.328997 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w"] Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.347415 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt"] Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.362834 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:53 crc kubenswrapper[4907]: E1129 14:47:53.363105 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:53 crc kubenswrapper[4907]: E1129 14:47:53.363150 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert podName:f9c64e5e-531f-4f11-b7d5-e22ed46b9b86 nodeName:}" failed. No retries permitted until 2025-11-29 14:47:55.363134825 +0000 UTC m=+1173.349972477 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert") pod "infra-operator-controller-manager-57548d458d-z92f7" (UID: "f9c64e5e-531f-4f11-b7d5-e22ed46b9b86") : secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.565245 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l"] Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.577736 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk"] Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.773798 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:53 crc kubenswrapper[4907]: E1129 14:47:53.773945 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:47:53 crc kubenswrapper[4907]: E1129 14:47:53.773994 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert podName:711dd79a-4219-43a3-9767-aad244b9c68f nodeName:}" failed. No retries permitted until 2025-11-29 14:47:55.773979619 +0000 UTC m=+1173.760817271 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" (UID: "711dd79a-4219-43a3-9767-aad244b9c68f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.862612 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" event={"ID":"e12e8dfe-6b2d-49f4-90d1-3165ec08f043","Type":"ContainerStarted","Data":"0be8b0eac02bdaa21e2e23489c0440a85f65f1c3c47ba052f0bfb005a91456f2"} Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.863944 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" event={"ID":"cdcaa6fe-2208-49d5-82d8-9b2c96be251d","Type":"ContainerStarted","Data":"bbe5e7839708496ff3346e75dd62d96b6393b769ccaf694a0c6b5c3bf5ab6d8d"} Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.866891 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" event={"ID":"103b9723-75c0-41ca-8264-41912d22a5cb","Type":"ContainerStarted","Data":"7cffe25d24dd54a30ae0309d3ff2bda585e7979171ab66819eee0bf5e267acdb"} Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.867982 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" event={"ID":"bbc59bc4-78c2-4534-b1bd-93cf4b60f86e","Type":"ContainerStarted","Data":"09a8b19c1c895381b2f2ec52a093dff6ada35975c105ab52e7dfe92df7fdc6ab"} Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.869156 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" 
event={"ID":"46a74794-f3b0-4bf0-9c94-0920441fd3ce","Type":"ContainerStarted","Data":"43d77a5b50fe1d02fc67ec142dd1feaf6e3cd80b716e3ee57822e034e8c7c01a"} Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.869805 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" event={"ID":"2b9ca9f5-7979-47ef-9e37-f9c519b57445","Type":"ContainerStarted","Data":"ed2ecb82573dcc4e122c57c95dc7cc222ca948a3d4a7a17867023d1c102a2441"} Nov 29 14:47:53 crc kubenswrapper[4907]: I1129 14:47:53.996073 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs"] Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.004607 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-qhggl"] Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.016154 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8"] Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.023114 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w"] Nov 29 14:47:54 crc kubenswrapper[4907]: W1129 14:47:54.057775 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418d0e3c_7354_4e14_b17b_bab93518e78b.slice/crio-59b48ca6ee1ffde0e4550af9fb1bab280c01d71b0cbae9a6315c868d5e8f547e WatchSource:0}: Error finding container 59b48ca6ee1ffde0e4550af9fb1bab280c01d71b0cbae9a6315c868d5e8f547e: Status 404 returned error can't find the container with id 59b48ca6ee1ffde0e4550af9fb1bab280c01d71b0cbae9a6315c868d5e8f547e Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.059075 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf"] Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.069081 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm"] Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.078851 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.078983 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.079151 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.079269 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:47:56.079192379 +0000 UTC m=+1174.066030031 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "metrics-server-cert" not found Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.079386 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.079625 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:47:56.07956074 +0000 UTC m=+1174.066398392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "webhook-server-cert" not found Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.279458 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564"] Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.303467 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn"] Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.328982 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh"] Nov 29 14:47:54 crc kubenswrapper[4907]: W1129 14:47:54.340688 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f331ad4_d753_41f4_82f9_c2bd60806987.slice/crio-ec46d50394cbcafc94c0a4a224655f69a49f5c04babda00dadc9bf65e653e856 WatchSource:0}: Error finding container ec46d50394cbcafc94c0a4a224655f69a49f5c04babda00dadc9bf65e653e856: Status 404 returned error can't find the container with id ec46d50394cbcafc94c0a4a224655f69a49f5c04babda00dadc9bf65e653e856 Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.344489 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:a56cff847472bbc2ff74c1f159f60d5390d3c1bf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2wxjm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-86bbb9c7fb-ldhkh_openstack-operators(dceea103-7394-4d01-9168-0b3f5b49306f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.349214 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2wxjm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-86bbb9c7fb-ldhkh_openstack-operators(dceea103-7394-4d01-9168-0b3f5b49306f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.351308 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" podUID="dceea103-7394-4d01-9168-0b3f5b49306f" Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.352455 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p88rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-gs88x_openstack-operators(7f331ad4-d753-41f4-82f9-c2bd60806987): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.355366 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p88rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-gs88x_openstack-operators(7f331ad4-d753-41f4-82f9-c2bd60806987): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.356713 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" podUID="7f331ad4-d753-41f4-82f9-c2bd60806987" Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.378859 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x"] Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.893188 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" 
event={"ID":"d193cf7e-774f-44b3-ae22-090d09c15ba5","Type":"ContainerStarted","Data":"0d3840b003eb83160540a6344627178cd0c915bb0595a0f7dd16ac311043327e"} Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.897129 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" event={"ID":"7f331ad4-d753-41f4-82f9-c2bd60806987","Type":"ContainerStarted","Data":"ec46d50394cbcafc94c0a4a224655f69a49f5c04babda00dadc9bf65e653e856"} Nov 29 14:47:54 crc kubenswrapper[4907]: E1129 14:47:54.901715 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" podUID="7f331ad4-d753-41f4-82f9-c2bd60806987" Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.907960 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" event={"ID":"418d0e3c-7354-4e14-b17b-bab93518e78b","Type":"ContainerStarted","Data":"59b48ca6ee1ffde0e4550af9fb1bab280c01d71b0cbae9a6315c868d5e8f547e"} Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.913750 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" event={"ID":"d9c5b591-4e0f-4f9e-930d-070798fccb44","Type":"ContainerStarted","Data":"61dcfb358f4d124135395c50234623df6022b2b39bc6876f062603e463a4f0b1"} Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.919050 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" event={"ID":"45db6747-0449-4839-b0ed-07e930579b83","Type":"ContainerStarted","Data":"613c67b129d2ebbe71050add69908a870bee3c51e8a61d39707ed4d3456dde6f"} Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.921646 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" event={"ID":"8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91","Type":"ContainerStarted","Data":"f4161965d9fc1d988dadd666c15fc04071dfa88688c15c15caca99c5d96aa95e"} Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.923642 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" event={"ID":"fd726e7f-5139-4eb6-b18d-24d14682648c","Type":"ContainerStarted","Data":"96ffa389db01fc853ca9e123bd78121354e300c2a1bd01fc7ec390aecda24f4e"} Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.930907 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" event={"ID":"ec0d115a-0b4b-4691-b4f4-778ffd7f6219","Type":"ContainerStarted","Data":"4337d2c9382770c5ead6b11f1aa36553f4de8dd011e83a3cf15eb44b7fc4aa7d"} Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.941682 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" event={"ID":"dceea103-7394-4d01-9168-0b3f5b49306f","Type":"ContainerStarted","Data":"b9cac831c4a2c9bef9b23f6203a6e46235ad8234177b8ad3e242243f473779a8"} Nov 29 14:47:54 crc kubenswrapper[4907]: I1129 14:47:54.944185 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" event={"ID":"3cbe9b24-61e0-449f-a91f-289fd9c5de8e","Type":"ContainerStarted","Data":"24649e103ebc691aedc33fce26cdbfa7fe52fd784e965d5bf73a4205a361f3d8"} Nov 29 14:47:54 crc 
kubenswrapper[4907]: E1129 14:47:54.948369 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:a56cff847472bbc2ff74c1f159f60d5390d3c1bf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" podUID="dceea103-7394-4d01-9168-0b3f5b49306f" Nov 29 14:47:55 crc kubenswrapper[4907]: I1129 14:47:55.410616 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:55 crc kubenswrapper[4907]: E1129 14:47:55.411018 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:55 crc kubenswrapper[4907]: E1129 14:47:55.411094 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert podName:f9c64e5e-531f-4f11-b7d5-e22ed46b9b86 nodeName:}" failed. No retries permitted until 2025-11-29 14:47:59.411075708 +0000 UTC m=+1177.397913360 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert") pod "infra-operator-controller-manager-57548d458d-z92f7" (UID: "f9c64e5e-531f-4f11-b7d5-e22ed46b9b86") : secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:55 crc kubenswrapper[4907]: I1129 14:47:55.817472 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:55 crc kubenswrapper[4907]: E1129 14:47:55.817689 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:47:55 crc kubenswrapper[4907]: E1129 14:47:55.817780 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert podName:711dd79a-4219-43a3-9767-aad244b9c68f nodeName:}" failed. No retries permitted until 2025-11-29 14:47:59.817760964 +0000 UTC m=+1177.804598616 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" (UID: "711dd79a-4219-43a3-9767-aad244b9c68f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:47:55 crc kubenswrapper[4907]: E1129 14:47:55.976885 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:a56cff847472bbc2ff74c1f159f60d5390d3c1bf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" podUID="dceea103-7394-4d01-9168-0b3f5b49306f" Nov 29 14:47:55 crc kubenswrapper[4907]: E1129 14:47:55.977461 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" podUID="7f331ad4-d753-41f4-82f9-c2bd60806987" Nov 29 14:47:56 crc kubenswrapper[4907]: I1129 14:47:56.121794 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " 
pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:56 crc kubenswrapper[4907]: I1129 14:47:56.122149 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:47:56 crc kubenswrapper[4907]: E1129 14:47:56.122003 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 14:47:56 crc kubenswrapper[4907]: E1129 14:47:56.122235 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:48:00.122215973 +0000 UTC m=+1178.109053625 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "metrics-server-cert" not found Nov 29 14:47:56 crc kubenswrapper[4907]: E1129 14:47:56.122305 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 14:47:56 crc kubenswrapper[4907]: E1129 14:47:56.122361 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:48:00.122346586 +0000 UTC m=+1178.109184238 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "webhook-server-cert" not found Nov 29 14:47:58 crc kubenswrapper[4907]: I1129 14:47:58.490479 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:47:58 crc kubenswrapper[4907]: I1129 14:47:58.490545 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:47:58 crc kubenswrapper[4907]: I1129 14:47:58.491846 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:47:58 crc kubenswrapper[4907]: I1129 14:47:58.492706 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8e5b56ee968d515ff618b3f298ba561ab70814c2dd33e300a89c15ce55549c1"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 14:47:58 crc kubenswrapper[4907]: I1129 14:47:58.492786 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" 
containerID="cri-o://b8e5b56ee968d515ff618b3f298ba561ab70814c2dd33e300a89c15ce55549c1" gracePeriod=600 Nov 29 14:47:59 crc kubenswrapper[4907]: I1129 14:47:59.015090 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="b8e5b56ee968d515ff618b3f298ba561ab70814c2dd33e300a89c15ce55549c1" exitCode=0 Nov 29 14:47:59 crc kubenswrapper[4907]: I1129 14:47:59.015168 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"b8e5b56ee968d515ff618b3f298ba561ab70814c2dd33e300a89c15ce55549c1"} Nov 29 14:47:59 crc kubenswrapper[4907]: I1129 14:47:59.015359 4907 scope.go:117] "RemoveContainer" containerID="81cf87bbb8090f9964b6b2dbf0b6be6946b5091a7113f8782940ac4da5885e64" Nov 29 14:47:59 crc kubenswrapper[4907]: I1129 14:47:59.425735 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:47:59 crc kubenswrapper[4907]: E1129 14:47:59.425958 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:59 crc kubenswrapper[4907]: E1129 14:47:59.426041 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert podName:f9c64e5e-531f-4f11-b7d5-e22ed46b9b86 nodeName:}" failed. No retries permitted until 2025-11-29 14:48:07.426014798 +0000 UTC m=+1185.412852530 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert") pod "infra-operator-controller-manager-57548d458d-z92f7" (UID: "f9c64e5e-531f-4f11-b7d5-e22ed46b9b86") : secret "infra-operator-webhook-server-cert" not found Nov 29 14:47:59 crc kubenswrapper[4907]: I1129 14:47:59.833324 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:47:59 crc kubenswrapper[4907]: E1129 14:47:59.833546 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:47:59 crc kubenswrapper[4907]: E1129 14:47:59.833603 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert podName:711dd79a-4219-43a3-9767-aad244b9c68f nodeName:}" failed. No retries permitted until 2025-11-29 14:48:07.833586099 +0000 UTC m=+1185.820423751 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" (UID: "711dd79a-4219-43a3-9767-aad244b9c68f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 14:48:00 crc kubenswrapper[4907]: I1129 14:48:00.139262 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:48:00 crc kubenswrapper[4907]: I1129 14:48:00.139435 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:48:00 crc kubenswrapper[4907]: E1129 14:48:00.139764 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 14:48:00 crc kubenswrapper[4907]: E1129 14:48:00.139901 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:48:08.1398649 +0000 UTC m=+1186.126702622 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "webhook-server-cert" not found Nov 29 14:48:00 crc kubenswrapper[4907]: E1129 14:48:00.139760 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 14:48:00 crc kubenswrapper[4907]: E1129 14:48:00.140656 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs podName:e1f880b2-0e04-48c5-81ca-0103abd439fe nodeName:}" failed. No retries permitted until 2025-11-29 14:48:08.140625521 +0000 UTC m=+1186.127463223 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs") pod "openstack-operator-controller-manager-f5c5f9868-d5gtz" (UID: "e1f880b2-0e04-48c5-81ca-0103abd439fe") : secret "metrics-server-cert" not found Nov 29 14:48:06 crc kubenswrapper[4907]: E1129 14:48:06.557736 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Nov 29 14:48:06 crc kubenswrapper[4907]: E1129 14:48:06.558744 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bk6kd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-brs2h_openstack-operators(cdcaa6fe-2208-49d5-82d8-9b2c96be251d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:48:07 crc kubenswrapper[4907]: E1129 14:48:07.255607 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Nov 29 14:48:07 crc kubenswrapper[4907]: E1129 14:48:07.256214 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tlrpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-fdjx8_openstack-operators(d193cf7e-774f-44b3-ae22-090d09c15ba5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:48:07 crc kubenswrapper[4907]: I1129 14:48:07.513642 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:48:07 crc kubenswrapper[4907]: I1129 14:48:07.533758 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c64e5e-531f-4f11-b7d5-e22ed46b9b86-cert\") pod \"infra-operator-controller-manager-57548d458d-z92f7\" (UID: \"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:48:07 crc kubenswrapper[4907]: I1129 14:48:07.576928 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:48:07 crc kubenswrapper[4907]: I1129 14:48:07.923729 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:48:07 crc kubenswrapper[4907]: I1129 14:48:07.942641 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711dd79a-4219-43a3-9767-aad244b9c68f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd\" (UID: \"711dd79a-4219-43a3-9767-aad244b9c68f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:48:08 crc kubenswrapper[4907]: I1129 14:48:08.014511 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:48:08 crc kubenswrapper[4907]: I1129 14:48:08.230600 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:48:08 crc kubenswrapper[4907]: I1129 14:48:08.231998 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:48:08 crc kubenswrapper[4907]: I1129 14:48:08.235751 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-metrics-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:48:08 crc kubenswrapper[4907]: I1129 14:48:08.239256 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1f880b2-0e04-48c5-81ca-0103abd439fe-webhook-certs\") pod \"openstack-operator-controller-manager-f5c5f9868-d5gtz\" (UID: \"e1f880b2-0e04-48c5-81ca-0103abd439fe\") " pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:48:08 crc kubenswrapper[4907]: I1129 14:48:08.486824 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:48:19 crc kubenswrapper[4907]: E1129 14:48:19.627890 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Nov 29 14:48:19 crc kubenswrapper[4907]: E1129 14:48:19.629955 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2n5sg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-kqbx2_openstack-operators(71e0b5bc-68d6-434d-97c4-0c6d3a324e15): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:48:19 crc kubenswrapper[4907]: E1129 14:48:19.647796 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Nov 29 14:48:19 crc kubenswrapper[4907]: E1129 14:48:19.648065 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gl6bf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-6h9pm_openstack-operators(418d0e3c-7354-4e14-b17b-bab93518e78b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:48:23 crc kubenswrapper[4907]: E1129 14:48:23.812740 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Nov 29 14:48:23 crc kubenswrapper[4907]: E1129 14:48:23.814497 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b4mpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-xqw5l_openstack-operators(e12e8dfe-6b2d-49f4-90d1-3165ec08f043): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:48:24 crc kubenswrapper[4907]: E1129 14:48:24.429847 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Nov 29 14:48:24 crc kubenswrapper[4907]: E1129 14:48:24.430250 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xcnbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-h9tkn_openstack-operators(fd726e7f-5139-4eb6-b18d-24d14682648c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:48:24 crc kubenswrapper[4907]: E1129 14:48:24.963865 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Nov 29 14:48:24 crc kubenswrapper[4907]: E1129 14:48:24.964015 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m869c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-kd564_openstack-operators(ec0d115a-0b4b-4691-b4f4-778ffd7f6219): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:48:24 crc kubenswrapper[4907]: E1129 14:48:24.965098 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" podUID="ec0d115a-0b4b-4691-b4f4-778ffd7f6219" Nov 29 14:48:25 crc kubenswrapper[4907]: E1129 14:48:25.262592 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" podUID="ec0d115a-0b4b-4691-b4f4-778ffd7f6219" Nov 29 14:48:25 crc kubenswrapper[4907]: E1129 14:48:25.441908 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3" Nov 29 14:48:25 crc kubenswrapper[4907]: E1129 14:48:25.442338 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d7q6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-kqq5w_openstack-operators(d9c5b591-4e0f-4f9e-930d-070798fccb44): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:48:26 crc kubenswrapper[4907]: E1129 14:48:26.765113 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Nov 29 14:48:26 crc kubenswrapper[4907]: E1129 14:48:26.766195 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xcfrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-p8qdf_openstack-operators(3cbe9b24-61e0-449f-a91f-289fd9c5de8e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:48:27 crc kubenswrapper[4907]: I1129 14:48:27.939453 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd"] Nov 29 14:48:27 crc kubenswrapper[4907]: I1129 14:48:27.989612 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-z92f7"] Nov 29 14:48:28 crc kubenswrapper[4907]: I1129 14:48:28.011904 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz"] Nov 29 14:48:28 crc kubenswrapper[4907]: I1129 14:48:28.294860 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"d1627f8336c2b950c2441aa29da8e2bbe0bedafb1bb7676292ab2c4a335d23b1"} Nov 29 14:48:28 crc 
kubenswrapper[4907]: I1129 14:48:28.301941 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" event={"ID":"711dd79a-4219-43a3-9767-aad244b9c68f","Type":"ContainerStarted","Data":"3c92c93baba8aa138cc618df92e163654db96a5e2a8000547594b65c3bdc81b6"} Nov 29 14:48:28 crc kubenswrapper[4907]: I1129 14:48:28.305608 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" event={"ID":"e1f880b2-0e04-48c5-81ca-0103abd439fe","Type":"ContainerStarted","Data":"f01df59999e934b12c5647d967ebd0b7a808780511cc42c7dcde505bb7d05efa"} Nov 29 14:48:28 crc kubenswrapper[4907]: I1129 14:48:28.307167 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" event={"ID":"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86","Type":"ContainerStarted","Data":"54d48523d981586b59e7a70f453285dc062729e349bc759f3a4adeee2ac82533"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.334146 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" event={"ID":"7f331ad4-d753-41f4-82f9-c2bd60806987","Type":"ContainerStarted","Data":"4282421a4f0a6d96d4652b433d3dfda47939a7c4077899f3d0ba22987d556d54"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.342310 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" event={"ID":"958f375f-e7a8-4d96-b2a1-dc5a63cdc865","Type":"ContainerStarted","Data":"19b4a45dcc8780c90c03eb027413dd99f43de949e438cc3471230668316cc6c4"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.350810 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" 
event={"ID":"cf7efbf1-79c8-45f9-8bed-7a33f47226ef","Type":"ContainerStarted","Data":"79d0189a3e82be8d1a05c2d189df4a24b0bf9d34db3cf10647fce1bda7846731"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.369268 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" event={"ID":"aca0ecce-183f-40cd-8ab0-aed5caf29556","Type":"ContainerStarted","Data":"320efd259b13d40b4cf47c1ac76810dfcde52503b2b7bfdbb1692434068cf24c"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.375579 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" event={"ID":"2b9ca9f5-7979-47ef-9e37-f9c519b57445","Type":"ContainerStarted","Data":"e000bd01719a659d1a62fafcfa3dfc26ede32da76edb4ad757061a72c1d36885"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.377154 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" event={"ID":"45db6747-0449-4839-b0ed-07e930579b83","Type":"ContainerStarted","Data":"abb53f1260231fa788e9c6c0cad862aca3ff7777b8158c1426e5264ccd26b492"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.378653 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" event={"ID":"103b9723-75c0-41ca-8264-41912d22a5cb","Type":"ContainerStarted","Data":"e9c1a7fde1d9685277ceeb03fe8fbef1985bbfafe8ee9fd89bb99df61e737fb6"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.379902 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" event={"ID":"bbc59bc4-78c2-4534-b1bd-93cf4b60f86e","Type":"ContainerStarted","Data":"40bedf133bbc574ccfaa834a3bec23d05471149ece9a495a09ceb5a6c26c7762"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.385369 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" event={"ID":"46a74794-f3b0-4bf0-9c94-0920441fd3ce","Type":"ContainerStarted","Data":"eb57625856494f005112e83181a02ec9e608fc5dd5a8b88b509bbf526a664fca"} Nov 29 14:48:29 crc kubenswrapper[4907]: I1129 14:48:29.389157 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" event={"ID":"8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91","Type":"ContainerStarted","Data":"39fbe34f4ea9fb3b9c8b12cfaa9c994850d63a265d7479fccc0496db0ab760dc"} Nov 29 14:48:35 crc kubenswrapper[4907]: I1129 14:48:35.460240 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" event={"ID":"e1f880b2-0e04-48c5-81ca-0103abd439fe","Type":"ContainerStarted","Data":"6d2a449f49f413aaabc15e36ea1949f5276e8b09c29f7cc976516d00225b7b65"} Nov 29 14:48:35 crc kubenswrapper[4907]: I1129 14:48:35.461031 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:48:35 crc kubenswrapper[4907]: I1129 14:48:35.492128 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" podStartSLOduration=44.49211102 podStartE2EDuration="44.49211102s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:48:35.487980963 +0000 UTC m=+1213.474818615" watchObservedRunningTime="2025-11-29 14:48:35.49211102 +0000 UTC m=+1213.478948662" Nov 29 14:48:38 crc kubenswrapper[4907]: I1129 14:48:38.507852 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" 
event={"ID":"dceea103-7394-4d01-9168-0b3f5b49306f","Type":"ContainerStarted","Data":"7957a702c45806ff522d93bd597d441d12e74b9490c9280b7cf0c55aef39dcd1"} Nov 29 14:48:38 crc kubenswrapper[4907]: E1129 14:48:38.993452 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" podUID="d9c5b591-4e0f-4f9e-930d-070798fccb44" Nov 29 14:48:39 crc kubenswrapper[4907]: E1129 14:48:39.018113 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" podUID="d193cf7e-774f-44b3-ae22-090d09c15ba5" Nov 29 14:48:39 crc kubenswrapper[4907]: E1129 14:48:39.135562 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" podUID="3cbe9b24-61e0-449f-a91f-289fd9c5de8e" Nov 29 14:48:39 crc kubenswrapper[4907]: E1129 14:48:39.170965 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" podUID="e12e8dfe-6b2d-49f4-90d1-3165ec08f043" Nov 29 14:48:39 crc kubenswrapper[4907]: E1129 14:48:39.226827 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" podUID="71e0b5bc-68d6-434d-97c4-0c6d3a324e15" Nov 29 14:48:39 crc kubenswrapper[4907]: E1129 14:48:39.324736 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" podUID="418d0e3c-7354-4e14-b17b-bab93518e78b" Nov 29 14:48:39 crc kubenswrapper[4907]: E1129 14:48:39.414101 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" podUID="fd726e7f-5139-4eb6-b18d-24d14682648c" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.543537 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" event={"ID":"71e0b5bc-68d6-434d-97c4-0c6d3a324e15","Type":"ContainerStarted","Data":"01f230c03c8371a8885a3fff5237e6258d2be6ea15ec4a25cd20f5185b2f015a"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.556714 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" event={"ID":"fd726e7f-5139-4eb6-b18d-24d14682648c","Type":"ContainerStarted","Data":"f8e0cd600425a59015a4ff0e04adc4088005026a965a6494bc277aa395b30447"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.580729 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" event={"ID":"418d0e3c-7354-4e14-b17b-bab93518e78b","Type":"ContainerStarted","Data":"af4d250af124edda8a38fe8b2e4a53466d74a171c435dd0f0e6aebd15ab6706b"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.591488 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" event={"ID":"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86","Type":"ContainerStarted","Data":"43dd37fcf49e6071f4ef6a43564aff2a87fcbec0ceebe608bcb5f7cd6b4da89b"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.610708 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" event={"ID":"711dd79a-4219-43a3-9767-aad244b9c68f","Type":"ContainerStarted","Data":"c035d7e4840ea462ddbeecfebf9fec8f716b29b42d870277a123446da83463ad"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.632284 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" event={"ID":"bbc59bc4-78c2-4534-b1bd-93cf4b60f86e","Type":"ContainerStarted","Data":"48ecb2fe3f0f314e687ca84afc93164c1d0b156e384432cca070ffe6d6942e96"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.634489 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.648938 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.687820 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" event={"ID":"d9c5b591-4e0f-4f9e-930d-070798fccb44","Type":"ContainerStarted","Data":"86e293211456981296ccf18956773b0c3b43d8bc7366e2d8b0e4d6a569e66974"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.715684 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" 
event={"ID":"dceea103-7394-4d01-9168-0b3f5b49306f","Type":"ContainerStarted","Data":"a8afa07e633459e064fc88c3dcd200397c2693fd2a1174b35086644cf1541cb2"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.716708 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.752126 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" event={"ID":"3cbe9b24-61e0-449f-a91f-289fd9c5de8e","Type":"ContainerStarted","Data":"60567541016f466406f73dbe806582cc03ef61d672921b1352b053a3be5793cc"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.757425 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-28vdp" podStartSLOduration=3.320335864 podStartE2EDuration="48.757402825s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:52.993621883 +0000 UTC m=+1170.980459535" lastFinishedPulling="2025-11-29 14:48:38.430688844 +0000 UTC m=+1216.417526496" observedRunningTime="2025-11-29 14:48:39.731332156 +0000 UTC m=+1217.718169798" watchObservedRunningTime="2025-11-29 14:48:39.757402825 +0000 UTC m=+1217.744240477" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.792320 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" event={"ID":"45db6747-0449-4839-b0ed-07e930579b83","Type":"ContainerStarted","Data":"86a11cb2aba1975f36e830edf360b98efb44f06d790239056dab26a424ac82ed"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.793326 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.813179 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.836063 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" event={"ID":"d193cf7e-774f-44b3-ae22-090d09c15ba5","Type":"ContainerStarted","Data":"fbe4eeffe2f68f9a0fab3bcdc012cc749da249e41a921c650970114ef84597cf"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.877931 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" podStartSLOduration=14.997131002 podStartE2EDuration="48.87791542s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.344307203 +0000 UTC m=+1172.331144855" lastFinishedPulling="2025-11-29 14:48:28.225091621 +0000 UTC m=+1206.211929273" observedRunningTime="2025-11-29 14:48:39.812757344 +0000 UTC m=+1217.799594996" watchObservedRunningTime="2025-11-29 14:48:39.87791542 +0000 UTC m=+1217.864753072" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.909129 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dpdcs" podStartSLOduration=4.598336794 podStartE2EDuration="48.909117005s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.038533717 +0000 UTC m=+1172.025371369" lastFinishedPulling="2025-11-29 14:48:38.349313928 +0000 UTC m=+1216.336151580" observedRunningTime="2025-11-29 14:48:39.898772392 +0000 UTC m=+1217.885610044" watchObservedRunningTime="2025-11-29 14:48:39.909117005 +0000 UTC m=+1217.895954657" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.914156 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" event={"ID":"7f331ad4-d753-41f4-82f9-c2bd60806987","Type":"ContainerStarted","Data":"2caeab7df247fbbf22c0adba8e21204d026282604f8a3677930eb8d251a7283a"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.916332 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.942851 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.944728 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" event={"ID":"aca0ecce-183f-40cd-8ab0-aed5caf29556","Type":"ContainerStarted","Data":"bc924fc40aeca890a4efd1adfb3ff2ef1bbc1baf5af23755affb5dbdbacd54d1"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.945574 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.951978 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.952345 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" event={"ID":"46a74794-f3b0-4bf0-9c94-0920441fd3ce","Type":"ContainerStarted","Data":"dafd76fd15d79e86f7fe3251446d823d177a878abaf83acad476eca436d18994"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.954229 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" Nov 29 14:48:39 crc 
kubenswrapper[4907]: I1129 14:48:39.970601 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.975346 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" event={"ID":"2b9ca9f5-7979-47ef-9e37-f9c519b57445","Type":"ContainerStarted","Data":"96588b11ea48174f1e5154ea249c7eaf19282a29c8be062521fa2f83c13d7117"} Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.976303 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.993974 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" Nov 29 14:48:39 crc kubenswrapper[4907]: I1129 14:48:39.995200 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" event={"ID":"e12e8dfe-6b2d-49f4-90d1-3165ec08f043","Type":"ContainerStarted","Data":"6bc59ec786ca1c26ed0f384c081d898024346156b01188c5cc84646a76b13c6c"} Nov 29 14:48:40 crc kubenswrapper[4907]: I1129 14:48:40.016889 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zspmk" podStartSLOduration=4.257682629 podStartE2EDuration="49.016871929s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:53.635291798 +0000 UTC m=+1171.622129450" lastFinishedPulling="2025-11-29 14:48:38.394481098 +0000 UTC m=+1216.381318750" observedRunningTime="2025-11-29 14:48:40.016141078 +0000 UTC m=+1218.002978730" watchObservedRunningTime="2025-11-29 14:48:40.016871929 +0000 UTC m=+1218.003709581" Nov 29 14:48:40 crc 
kubenswrapper[4907]: I1129 14:48:40.094556 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-28hrq" podStartSLOduration=3.570979788 podStartE2EDuration="49.09454013s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:52.822689359 +0000 UTC m=+1170.809527011" lastFinishedPulling="2025-11-29 14:48:38.346249701 +0000 UTC m=+1216.333087353" observedRunningTime="2025-11-29 14:48:40.08500761 +0000 UTC m=+1218.071845252" watchObservedRunningTime="2025-11-29 14:48:40.09454013 +0000 UTC m=+1218.081377782" Nov 29 14:48:40 crc kubenswrapper[4907]: I1129 14:48:40.094950 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gs88x" podStartSLOduration=5.021129257 podStartE2EDuration="49.094946422s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.35229666 +0000 UTC m=+1172.339134312" lastFinishedPulling="2025-11-29 14:48:38.426113835 +0000 UTC m=+1216.412951477" observedRunningTime="2025-11-29 14:48:40.056571184 +0000 UTC m=+1218.043408836" watchObservedRunningTime="2025-11-29 14:48:40.094946422 +0000 UTC m=+1218.081784074" Nov 29 14:48:40 crc kubenswrapper[4907]: I1129 14:48:40.156378 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4dlgt" podStartSLOduration=4.136596737 podStartE2EDuration="49.156360812s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:53.404647341 +0000 UTC m=+1171.391484993" lastFinishedPulling="2025-11-29 14:48:38.424411426 +0000 UTC m=+1216.411249068" observedRunningTime="2025-11-29 14:48:40.146833352 +0000 UTC m=+1218.133671004" watchObservedRunningTime="2025-11-29 14:48:40.156360812 +0000 UTC m=+1218.143198464" Nov 29 14:48:40 crc kubenswrapper[4907]: E1129 
14:48:40.227846 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" podUID="cdcaa6fe-2208-49d5-82d8-9b2c96be251d" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.020011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" event={"ID":"f9c64e5e-531f-4f11-b7d5-e22ed46b9b86","Type":"ContainerStarted","Data":"7c946f1a758744f3d82699847808a2d798ae757393da1e04f910c94c0f2edc0d"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.020360 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.022418 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" event={"ID":"958f375f-e7a8-4d96-b2a1-dc5a63cdc865","Type":"ContainerStarted","Data":"2a4b4c127210a5a0cefedc21fe1a808e2391f13a08fb0ee0b239b155b2705237"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.022807 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.025090 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.025651 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" 
event={"ID":"711dd79a-4219-43a3-9767-aad244b9c68f","Type":"ContainerStarted","Data":"eb502e5bef0bdeba1174d56b040e79a7c2b8a029dc861c41fb858dbaccd8ddd7"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.025749 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.026893 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" event={"ID":"cdcaa6fe-2208-49d5-82d8-9b2c96be251d","Type":"ContainerStarted","Data":"fff1c556480c4a461042b55692ad57a9b09c4082ffdcb06157cb555b0372ea4a"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.034300 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" event={"ID":"8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91","Type":"ContainerStarted","Data":"75a276db708659628c8cfb40aad15d2f25b8f0e9a9adae913e5171ffe6bde2cb"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.034672 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.045578 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" podStartSLOduration=40.110771575 podStartE2EDuration="50.045563064s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:48:28.212835313 +0000 UTC m=+1206.199672965" lastFinishedPulling="2025-11-29 14:48:38.147626802 +0000 UTC m=+1216.134464454" observedRunningTime="2025-11-29 14:48:41.040864681 +0000 UTC m=+1219.027702323" watchObservedRunningTime="2025-11-29 14:48:41.045563064 +0000 UTC m=+1219.032400716" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 
14:48:41.045900 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.058667 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.077560 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" event={"ID":"103b9723-75c0-41ca-8264-41912d22a5cb","Type":"ContainerStarted","Data":"a2edf5e7a7d47807db7e4eaa0d1cbfbc21f792e4193d7290fb2ca2f6720e84d2"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.078290 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.083705 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" event={"ID":"ec0d115a-0b4b-4691-b4f4-778ffd7f6219","Type":"ContainerStarted","Data":"170164f72d3a47fae9e19e9b0b05654713740164879b6bd76c5d7ca80b14da07"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.089225 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.099013 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" event={"ID":"d193cf7e-774f-44b3-ae22-090d09c15ba5","Type":"ContainerStarted","Data":"37a1b8ff3d970d3f002e8a7e31cc6ad4544acda319d122f5e75f1ec02059e8d5"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.099241 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.109562 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" event={"ID":"418d0e3c-7354-4e14-b17b-bab93518e78b","Type":"ContainerStarted","Data":"3c3a6674470b94d82c039ae28c80bc999edefe8a6797fe8188e588a4e8108667"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.110454 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.120767 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" podStartSLOduration=40.186171683 podStartE2EDuration="50.120752125s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:48:28.212822053 +0000 UTC m=+1206.199659705" lastFinishedPulling="2025-11-29 14:48:38.147402475 +0000 UTC m=+1216.134240147" observedRunningTime="2025-11-29 14:48:41.103342982 +0000 UTC m=+1219.090180634" watchObservedRunningTime="2025-11-29 14:48:41.120752125 +0000 UTC m=+1219.107589777" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.127385 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-qhggl" podStartSLOduration=5.718125591 podStartE2EDuration="50.127371703s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.007144447 +0000 UTC m=+1171.993982099" lastFinishedPulling="2025-11-29 14:48:38.416390549 +0000 UTC m=+1216.403228211" observedRunningTime="2025-11-29 14:48:41.120086516 +0000 UTC m=+1219.106924168" watchObservedRunningTime="2025-11-29 14:48:41.127371703 +0000 UTC m=+1219.114209355" Nov 29 14:48:41 crc 
kubenswrapper[4907]: I1129 14:48:41.132137 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" event={"ID":"cf7efbf1-79c8-45f9-8bed-7a33f47226ef","Type":"ContainerStarted","Data":"cc0fa9f0a60db4ce9387ca50386ed52613f2ce56185731899fa242e5f32d900d"} Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.132351 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.133740 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.150291 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-b8f6h" podStartSLOduration=4.115737207 podStartE2EDuration="50.150275452s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:52.360036536 +0000 UTC m=+1170.346874188" lastFinishedPulling="2025-11-29 14:48:38.394574781 +0000 UTC m=+1216.381412433" observedRunningTime="2025-11-29 14:48:41.140786353 +0000 UTC m=+1219.127624005" watchObservedRunningTime="2025-11-29 14:48:41.150275452 +0000 UTC m=+1219.137113104" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.233398 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" podStartSLOduration=3.6787871130000003 podStartE2EDuration="50.233382097s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.01535562 +0000 UTC m=+1172.002193272" lastFinishedPulling="2025-11-29 14:48:40.569950594 +0000 UTC m=+1218.556788256" observedRunningTime="2025-11-29 14:48:41.231623747 +0000 UTC 
m=+1219.218461399" watchObservedRunningTime="2025-11-29 14:48:41.233382097 +0000 UTC m=+1219.220219749" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.253163 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" podStartSLOduration=3.9030733189999998 podStartE2EDuration="50.253146987s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.061688433 +0000 UTC m=+1172.048526085" lastFinishedPulling="2025-11-29 14:48:40.411762101 +0000 UTC m=+1218.398599753" observedRunningTime="2025-11-29 14:48:41.248109124 +0000 UTC m=+1219.234946776" watchObservedRunningTime="2025-11-29 14:48:41.253146987 +0000 UTC m=+1219.239984629" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.339853 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" podStartSLOduration=4.048744258 podStartE2EDuration="50.339837874s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.343975664 +0000 UTC m=+1172.330813316" lastFinishedPulling="2025-11-29 14:48:40.63506928 +0000 UTC m=+1218.621906932" observedRunningTime="2025-11-29 14:48:41.330066127 +0000 UTC m=+1219.316903769" watchObservedRunningTime="2025-11-29 14:48:41.339837874 +0000 UTC m=+1219.326675526" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.340260 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wspkj" podStartSLOduration=4.704086252 podStartE2EDuration="50.340256396s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:52.759011494 +0000 UTC m=+1170.745849136" lastFinishedPulling="2025-11-29 14:48:38.395181628 +0000 UTC m=+1216.382019280" observedRunningTime="2025-11-29 14:48:41.280974696 +0000 UTC 
m=+1219.267812348" watchObservedRunningTime="2025-11-29 14:48:41.340256396 +0000 UTC m=+1219.327094048" Nov 29 14:48:41 crc kubenswrapper[4907]: I1129 14:48:41.386398 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jm57w" podStartSLOduration=5.299522257 podStartE2EDuration="50.386381983s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:53.339904617 +0000 UTC m=+1171.326742269" lastFinishedPulling="2025-11-29 14:48:38.426764343 +0000 UTC m=+1216.413601995" observedRunningTime="2025-11-29 14:48:41.385096297 +0000 UTC m=+1219.371933949" watchObservedRunningTime="2025-11-29 14:48:41.386381983 +0000 UTC m=+1219.373219625" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.140541 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" event={"ID":"d9c5b591-4e0f-4f9e-930d-070798fccb44","Type":"ContainerStarted","Data":"e2fbe54e34a2e48daf168d20d1bf15b9ede4f73561d17529b32880d8747973e7"} Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.141361 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.142728 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" event={"ID":"3cbe9b24-61e0-449f-a91f-289fd9c5de8e","Type":"ContainerStarted","Data":"d5a88457ff7c0d8f1fb3d72af14cea4f08c6b82dd072f12333563d4400c13b1b"} Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.142854 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.144726 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" event={"ID":"e12e8dfe-6b2d-49f4-90d1-3165ec08f043","Type":"ContainerStarted","Data":"4f42b1b0d15db9f82d4ef35714f5fcf728d8396f6bdab81e333bc418c380bc2a"} Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.144826 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.147146 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" event={"ID":"cdcaa6fe-2208-49d5-82d8-9b2c96be251d","Type":"ContainerStarted","Data":"c83c0401c8cbb459a0854e51937368ca618fd8b8786c18dc20c5e8c1c23952d1"} Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.147258 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.149215 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" event={"ID":"71e0b5bc-68d6-434d-97c4-0c6d3a324e15","Type":"ContainerStarted","Data":"a0ec152ff42a86c13efe95aa0d1a38bac4f3fade31fb6e04d170fd0b8fe35672"} Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.149363 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.151256 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" event={"ID":"fd726e7f-5139-4eb6-b18d-24d14682648c","Type":"ContainerStarted","Data":"623555b87c0cb568a4f6df16beb00c80a67d8e67d51ba5467949c4aae9e704df"} Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.168959 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" podStartSLOduration=4.575299302 podStartE2EDuration="51.168944243s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.038191427 +0000 UTC m=+1172.025029079" lastFinishedPulling="2025-11-29 14:48:40.631836358 +0000 UTC m=+1218.618674020" observedRunningTime="2025-11-29 14:48:42.166762101 +0000 UTC m=+1220.153599753" watchObservedRunningTime="2025-11-29 14:48:42.168944243 +0000 UTC m=+1220.155781885" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.172572 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kd564" podStartSLOduration=4.073490432 podStartE2EDuration="50.172565675s" podCreationTimestamp="2025-11-29 14:47:52 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.311259607 +0000 UTC m=+1172.298097259" lastFinishedPulling="2025-11-29 14:48:40.41033485 +0000 UTC m=+1218.397172502" observedRunningTime="2025-11-29 14:48:41.414205132 +0000 UTC m=+1219.401042784" watchObservedRunningTime="2025-11-29 14:48:42.172565675 +0000 UTC m=+1220.159403327" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.200061 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" podStartSLOduration=4.637996009 podStartE2EDuration="51.200047154s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:54.069813244 +0000 UTC m=+1172.056650896" lastFinishedPulling="2025-11-29 14:48:40.631864389 +0000 UTC m=+1218.618702041" observedRunningTime="2025-11-29 14:48:42.194763895 +0000 UTC m=+1220.181601547" watchObservedRunningTime="2025-11-29 14:48:42.200047154 +0000 UTC m=+1220.186884806" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.236694 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" podStartSLOduration=4.165208919 podStartE2EDuration="51.236678703s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:53.645803146 +0000 UTC m=+1171.632640798" lastFinishedPulling="2025-11-29 14:48:40.71727293 +0000 UTC m=+1218.704110582" observedRunningTime="2025-11-29 14:48:42.232457583 +0000 UTC m=+1220.219295235" watchObservedRunningTime="2025-11-29 14:48:42.236678703 +0000 UTC m=+1220.223516355" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.299633 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" podStartSLOduration=3.484074825 podStartE2EDuration="51.299617636s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:52.818324195 +0000 UTC m=+1170.805161837" lastFinishedPulling="2025-11-29 14:48:40.633866996 +0000 UTC m=+1218.620704648" observedRunningTime="2025-11-29 14:48:42.277340845 +0000 UTC m=+1220.264178497" watchObservedRunningTime="2025-11-29 14:48:42.299617636 +0000 UTC m=+1220.286455288" Nov 29 14:48:42 crc kubenswrapper[4907]: I1129 14:48:42.303026 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" podStartSLOduration=2.61449641 podStartE2EDuration="51.303018083s" podCreationTimestamp="2025-11-29 14:47:51 +0000 UTC" firstStartedPulling="2025-11-29 14:47:52.825595521 +0000 UTC m=+1170.812433173" lastFinishedPulling="2025-11-29 14:48:41.514117194 +0000 UTC m=+1219.500954846" observedRunningTime="2025-11-29 14:48:42.300780289 +0000 UTC m=+1220.287617941" watchObservedRunningTime="2025-11-29 14:48:42.303018083 +0000 UTC m=+1220.289855735" Nov 29 14:48:47 crc kubenswrapper[4907]: I1129 14:48:47.585891 4907 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-z92f7" Nov 29 14:48:48 crc kubenswrapper[4907]: I1129 14:48:48.022127 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd" Nov 29 14:48:48 crc kubenswrapper[4907]: I1129 14:48:48.518956 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-f5c5f9868-d5gtz" Nov 29 14:48:51 crc kubenswrapper[4907]: I1129 14:48:51.462888 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-kqbx2" Nov 29 14:48:51 crc kubenswrapper[4907]: I1129 14:48:51.489204 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-brs2h" Nov 29 14:48:52 crc kubenswrapper[4907]: I1129 14:48:52.095333 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-xqw5l" Nov 29 14:48:52 crc kubenswrapper[4907]: I1129 14:48:52.182270 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-kqq5w" Nov 29 14:48:52 crc kubenswrapper[4907]: I1129 14:48:52.320072 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p8qdf" Nov 29 14:48:52 crc kubenswrapper[4907]: I1129 14:48:52.452396 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-fdjx8" Nov 29 14:48:52 crc kubenswrapper[4907]: I1129 14:48:52.559148 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/telemetry-operator-controller-manager-86bbb9c7fb-ldhkh" Nov 29 14:48:52 crc kubenswrapper[4907]: I1129 14:48:52.565431 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6h9pm" Nov 29 14:48:52 crc kubenswrapper[4907]: I1129 14:48:52.579489 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-h9tkn" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.302622 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-h69pd"] Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.311245 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.312814 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-h69pd"] Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.314108 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6wjjf" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.314366 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.315914 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.316094 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.372511 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d866421e-a132-4651-8452-f6cf2af40410-config\") pod \"dnsmasq-dns-675f4bcbfc-h69pd\" (UID: 
\"d866421e-a132-4651-8452-f6cf2af40410\") " pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.372566 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjf77\" (UniqueName: \"kubernetes.io/projected/d866421e-a132-4651-8452-f6cf2af40410-kube-api-access-xjf77\") pod \"dnsmasq-dns-675f4bcbfc-h69pd\" (UID: \"d866421e-a132-4651-8452-f6cf2af40410\") " pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.383860 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9lr9"] Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.386191 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.389061 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.394281 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9lr9"] Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.475107 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjf77\" (UniqueName: \"kubernetes.io/projected/d866421e-a132-4651-8452-f6cf2af40410-kube-api-access-xjf77\") pod \"dnsmasq-dns-675f4bcbfc-h69pd\" (UID: \"d866421e-a132-4651-8452-f6cf2af40410\") " pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.475336 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-config\") pod \"dnsmasq-dns-78dd6ddcc-w9lr9\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc 
kubenswrapper[4907]: I1129 14:49:08.475372 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w9lr9\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.475420 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk9zs\" (UniqueName: \"kubernetes.io/projected/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-kube-api-access-qk9zs\") pod \"dnsmasq-dns-78dd6ddcc-w9lr9\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.475518 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d866421e-a132-4651-8452-f6cf2af40410-config\") pod \"dnsmasq-dns-675f4bcbfc-h69pd\" (UID: \"d866421e-a132-4651-8452-f6cf2af40410\") " pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.476604 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d866421e-a132-4651-8452-f6cf2af40410-config\") pod \"dnsmasq-dns-675f4bcbfc-h69pd\" (UID: \"d866421e-a132-4651-8452-f6cf2af40410\") " pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.494246 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjf77\" (UniqueName: \"kubernetes.io/projected/d866421e-a132-4651-8452-f6cf2af40410-kube-api-access-xjf77\") pod \"dnsmasq-dns-675f4bcbfc-h69pd\" (UID: \"d866421e-a132-4651-8452-f6cf2af40410\") " pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.577472 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-config\") pod \"dnsmasq-dns-78dd6ddcc-w9lr9\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.577543 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w9lr9\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.577583 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk9zs\" (UniqueName: \"kubernetes.io/projected/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-kube-api-access-qk9zs\") pod \"dnsmasq-dns-78dd6ddcc-w9lr9\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.578817 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w9lr9\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.580026 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-config\") pod \"dnsmasq-dns-78dd6ddcc-w9lr9\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.597353 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk9zs\" (UniqueName: 
\"kubernetes.io/projected/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-kube-api-access-qk9zs\") pod \"dnsmasq-dns-78dd6ddcc-w9lr9\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.637186 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:08 crc kubenswrapper[4907]: I1129 14:49:08.712297 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:09 crc kubenswrapper[4907]: I1129 14:49:09.117656 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-h69pd"] Nov 29 14:49:09 crc kubenswrapper[4907]: W1129 14:49:09.121930 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd866421e_a132_4651_8452_f6cf2af40410.slice/crio-ba98db32292acb6db1db7c82f0a5f7badd3484d246ee6652af79bd90abdbe869 WatchSource:0}: Error finding container ba98db32292acb6db1db7c82f0a5f7badd3484d246ee6652af79bd90abdbe869: Status 404 returned error can't find the container with id ba98db32292acb6db1db7c82f0a5f7badd3484d246ee6652af79bd90abdbe869 Nov 29 14:49:09 crc kubenswrapper[4907]: I1129 14:49:09.283112 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9lr9"] Nov 29 14:49:09 crc kubenswrapper[4907]: W1129 14:49:09.283578 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8d62e8f_7e5c_4fe1_b06d_4499b10075e3.slice/crio-613a9d0747ecf2e820d4bcb7c62b3996f92fd44f7430efedddc739ced94eedc8 WatchSource:0}: Error finding container 613a9d0747ecf2e820d4bcb7c62b3996f92fd44f7430efedddc739ced94eedc8: Status 404 returned error can't find the container with id 613a9d0747ecf2e820d4bcb7c62b3996f92fd44f7430efedddc739ced94eedc8 Nov 
29 14:49:09 crc kubenswrapper[4907]: I1129 14:49:09.417947 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" event={"ID":"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3","Type":"ContainerStarted","Data":"613a9d0747ecf2e820d4bcb7c62b3996f92fd44f7430efedddc739ced94eedc8"} Nov 29 14:49:09 crc kubenswrapper[4907]: I1129 14:49:09.419497 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" event={"ID":"d866421e-a132-4651-8452-f6cf2af40410","Type":"ContainerStarted","Data":"ba98db32292acb6db1db7c82f0a5f7badd3484d246ee6652af79bd90abdbe869"} Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.344825 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-h69pd"] Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.371288 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dfdx9"] Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.372816 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.401860 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dfdx9"] Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.430120 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-config\") pod \"dnsmasq-dns-666b6646f7-dfdx9\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.430172 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhpcm\" (UniqueName: \"kubernetes.io/projected/37167587-89bd-4132-89b5-c71426846175-kube-api-access-mhpcm\") pod \"dnsmasq-dns-666b6646f7-dfdx9\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.430218 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dfdx9\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.532449 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-config\") pod \"dnsmasq-dns-666b6646f7-dfdx9\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.532496 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhpcm\" (UniqueName: 
\"kubernetes.io/projected/37167587-89bd-4132-89b5-c71426846175-kube-api-access-mhpcm\") pod \"dnsmasq-dns-666b6646f7-dfdx9\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.532950 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dfdx9\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.534027 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dfdx9\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.534091 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-config\") pod \"dnsmasq-dns-666b6646f7-dfdx9\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.583074 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhpcm\" (UniqueName: \"kubernetes.io/projected/37167587-89bd-4132-89b5-c71426846175-kube-api-access-mhpcm\") pod \"dnsmasq-dns-666b6646f7-dfdx9\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.689078 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9lr9"] Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.704682 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.714911 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hrnbt"] Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.717243 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.733565 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hrnbt"] Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.852458 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmkh5\" (UniqueName: \"kubernetes.io/projected/5dbd4b99-a473-4c9b-a372-492c3ae34064-kube-api-access-cmkh5\") pod \"dnsmasq-dns-57d769cc4f-hrnbt\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.852802 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hrnbt\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.853108 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-config\") pod \"dnsmasq-dns-57d769cc4f-hrnbt\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.955080 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkh5\" (UniqueName: 
\"kubernetes.io/projected/5dbd4b99-a473-4c9b-a372-492c3ae34064-kube-api-access-cmkh5\") pod \"dnsmasq-dns-57d769cc4f-hrnbt\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.955823 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hrnbt\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.955892 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-config\") pod \"dnsmasq-dns-57d769cc4f-hrnbt\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.956804 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-config\") pod \"dnsmasq-dns-57d769cc4f-hrnbt\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:11 crc kubenswrapper[4907]: I1129 14:49:11.956854 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hrnbt\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.007669 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmkh5\" (UniqueName: \"kubernetes.io/projected/5dbd4b99-a473-4c9b-a372-492c3ae34064-kube-api-access-cmkh5\") pod \"dnsmasq-dns-57d769cc4f-hrnbt\" 
(UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.133853 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.351455 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dfdx9"] Nov 29 14:49:12 crc kubenswrapper[4907]: W1129 14:49:12.359649 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37167587_89bd_4132_89b5_c71426846175.slice/crio-953a921c795234092ac3748e94973184f5f1bd9638f3458619a904d20cfc1890 WatchSource:0}: Error finding container 953a921c795234092ac3748e94973184f5f1bd9638f3458619a904d20cfc1890: Status 404 returned error can't find the container with id 953a921c795234092ac3748e94973184f5f1bd9638f3458619a904d20cfc1890 Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.472883 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" event={"ID":"37167587-89bd-4132-89b5-c71426846175","Type":"ContainerStarted","Data":"953a921c795234092ac3748e94973184f5f1bd9638f3458619a904d20cfc1890"} Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.510604 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.513144 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.516001 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.516020 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lpglp" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.516201 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.516340 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.516360 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.519096 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.519502 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.543524 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.583777 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.583849 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/550beb06-c1d9-4568-bb4e-66ff9134cb8e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.583873 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/550beb06-c1d9-4568-bb4e-66ff9134cb8e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.583897 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.583975 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.584056 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.584169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.584189 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.584224 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqdqh\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-kube-api-access-gqdqh\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.584300 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.584333 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-config-data\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.693303 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqdqh\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-kube-api-access-gqdqh\") pod \"rabbitmq-server-0\" (UID: 
\"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.694750 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.694829 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-config-data\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.694900 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.694949 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/550beb06-c1d9-4568-bb4e-66ff9134cb8e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.695031 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/550beb06-c1d9-4568-bb4e-66ff9134cb8e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.695099 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.695152 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.695204 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.695266 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.695285 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.695414 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-plugins\") 
pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.696242 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.696644 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.696987 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.697118 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.698077 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-config-data\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.704284 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.707258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/550beb06-c1d9-4568-bb4e-66ff9134cb8e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.718266 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqdqh\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-kube-api-access-gqdqh\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: W1129 14:49:12.718921 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dbd4b99_a473_4c9b_a372_492c3ae34064.slice/crio-f4f825a49a46a56d9224b516811d9a0db9e3eae7571e232accb7a9a6e804284a WatchSource:0}: Error finding container f4f825a49a46a56d9224b516811d9a0db9e3eae7571e232accb7a9a6e804284a: Status 404 returned error can't find the container with id f4f825a49a46a56d9224b516811d9a0db9e3eae7571e232accb7a9a6e804284a Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.722768 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.724339 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/550beb06-c1d9-4568-bb4e-66ff9134cb8e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.724529 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hrnbt"] Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.729221 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.834076 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.862030 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.880053 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.884370 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.884867 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.885105 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.885748 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.886588 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qjtjp" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.887181 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.891090 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 29 14:49:12 crc kubenswrapper[4907]: I1129 14:49:12.910160 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.014164 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aee0179-2960-486d-8129-1e928d55a29f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.014245 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-p5whq\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-kube-api-access-p5whq\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.014402 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.014581 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.014626 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.014651 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.014707 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.014809 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.015428 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.015487 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aee0179-2960-486d-8129-1e928d55a29f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.015516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116565 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116613 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116634 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116652 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116679 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116717 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116736 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116754 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aee0179-2960-486d-8129-1e928d55a29f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116770 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116836 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aee0179-2960-486d-8129-1e928d55a29f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.116860 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5whq\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-kube-api-access-p5whq\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.117746 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.117782 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.117968 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.118188 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.119777 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.121895 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.122866 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aee0179-2960-486d-8129-1e928d55a29f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.123428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.125367 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.132371 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aee0179-2960-486d-8129-1e928d55a29f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.137555 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5whq\" (UniqueName: 
\"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-kube-api-access-p5whq\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.169721 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.302028 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:49:13 crc kubenswrapper[4907]: W1129 14:49:13.423014 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod550beb06_c1d9_4568_bb4e_66ff9134cb8e.slice/crio-d1435dba1ce494ba69ef6bc36ef7a9c35cde74d6a14652960b3de681cc797049 WatchSource:0}: Error finding container d1435dba1ce494ba69ef6bc36ef7a9c35cde74d6a14652960b3de681cc797049: Status 404 returned error can't find the container with id d1435dba1ce494ba69ef6bc36ef7a9c35cde74d6a14652960b3de681cc797049 Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.428504 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.485388 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"550beb06-c1d9-4568-bb4e-66ff9134cb8e","Type":"ContainerStarted","Data":"d1435dba1ce494ba69ef6bc36ef7a9c35cde74d6a14652960b3de681cc797049"} Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.486598 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" 
event={"ID":"5dbd4b99-a473-4c9b-a372-492c3ae34064","Type":"ContainerStarted","Data":"f4f825a49a46a56d9224b516811d9a0db9e3eae7571e232accb7a9a6e804284a"} Nov 29 14:49:13 crc kubenswrapper[4907]: I1129 14:49:13.834826 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.155361 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.167035 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.179954 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.180107 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-49c85" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.186382 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.195100 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.195299 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.202756 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.248065 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b4712d7-a81b-455f-841a-a0ca14eafcbe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " 
pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.248127 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b4712d7-a81b-455f-841a-a0ca14eafcbe-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.248165 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.248233 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wvbh\" (UniqueName: \"kubernetes.io/projected/2b4712d7-a81b-455f-841a-a0ca14eafcbe-kube-api-access-9wvbh\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.248261 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4712d7-a81b-455f-841a-a0ca14eafcbe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.248280 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b4712d7-a81b-455f-841a-a0ca14eafcbe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc 
kubenswrapper[4907]: I1129 14:49:14.248324 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4712d7-a81b-455f-841a-a0ca14eafcbe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.248339 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b4712d7-a81b-455f-841a-a0ca14eafcbe-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.355105 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4712d7-a81b-455f-841a-a0ca14eafcbe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.355171 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b4712d7-a81b-455f-841a-a0ca14eafcbe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.355238 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4712d7-a81b-455f-841a-a0ca14eafcbe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.355259 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b4712d7-a81b-455f-841a-a0ca14eafcbe-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.355321 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b4712d7-a81b-455f-841a-a0ca14eafcbe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.355362 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b4712d7-a81b-455f-841a-a0ca14eafcbe-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.355411 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.355525 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wvbh\" (UniqueName: \"kubernetes.io/projected/2b4712d7-a81b-455f-841a-a0ca14eafcbe-kube-api-access-9wvbh\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.357721 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/2b4712d7-a81b-455f-841a-a0ca14eafcbe-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.358526 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b4712d7-a81b-455f-841a-a0ca14eafcbe-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.359622 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.362775 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b4712d7-a81b-455f-841a-a0ca14eafcbe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.366695 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4712d7-a81b-455f-841a-a0ca14eafcbe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.371297 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4712d7-a81b-455f-841a-a0ca14eafcbe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.386051 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b4712d7-a81b-455f-841a-a0ca14eafcbe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.400595 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wvbh\" (UniqueName: \"kubernetes.io/projected/2b4712d7-a81b-455f-841a-a0ca14eafcbe-kube-api-access-9wvbh\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.427555 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"2b4712d7-a81b-455f-841a-a0ca14eafcbe\") " pod="openstack/openstack-galera-0" Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.512406 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8aee0179-2960-486d-8129-1e928d55a29f","Type":"ContainerStarted","Data":"404850aa49bc77f573aac1632419270aa1b073880dce5acd45edd4eb16ba0358"} Nov 29 14:49:14 crc kubenswrapper[4907]: I1129 14:49:14.530594 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.477833 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.485343 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.485548 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.487518 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nnlj7" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.490368 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.502472 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.502606 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.687517 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9f58cda-06bb-41f5-b91d-fdf10dab6164-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.687559 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e9f58cda-06bb-41f5-b91d-fdf10dab6164-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.687595 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfkt9\" (UniqueName: \"kubernetes.io/projected/e9f58cda-06bb-41f5-b91d-fdf10dab6164-kube-api-access-pfkt9\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.687629 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e9f58cda-06bb-41f5-b91d-fdf10dab6164-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.687662 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f58cda-06bb-41f5-b91d-fdf10dab6164-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.687730 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9f58cda-06bb-41f5-b91d-fdf10dab6164-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.687745 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e9f58cda-06bb-41f5-b91d-fdf10dab6164-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.687764 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.690624 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.691809 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.693648 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gq6ph" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.694265 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.698539 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.706292 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793062 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/828d6b05-06be-4157-8163-96a3220fedb0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793106 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/828d6b05-06be-4157-8163-96a3220fedb0-config-data\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793146 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9f58cda-06bb-41f5-b91d-fdf10dab6164-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793168 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e9f58cda-06bb-41f5-b91d-fdf10dab6164-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793186 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk7lg\" (UniqueName: \"kubernetes.io/projected/828d6b05-06be-4157-8163-96a3220fedb0-kube-api-access-lk7lg\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793216 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfkt9\" (UniqueName: \"kubernetes.io/projected/e9f58cda-06bb-41f5-b91d-fdf10dab6164-kube-api-access-pfkt9\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793242 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/828d6b05-06be-4157-8163-96a3220fedb0-kolla-config\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793268 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e9f58cda-06bb-41f5-b91d-fdf10dab6164-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793298 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f58cda-06bb-41f5-b91d-fdf10dab6164-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793365 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828d6b05-06be-4157-8163-96a3220fedb0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793391 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9f58cda-06bb-41f5-b91d-fdf10dab6164-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793409 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.793424 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e9f58cda-06bb-41f5-b91d-fdf10dab6164-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.794385 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e9f58cda-06bb-41f5-b91d-fdf10dab6164-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.794448 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e9f58cda-06bb-41f5-b91d-fdf10dab6164-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.796391 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9f58cda-06bb-41f5-b91d-fdf10dab6164-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.796445 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") device mount path \"/mnt/openstack/pv04\"" 
pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.799124 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e9f58cda-06bb-41f5-b91d-fdf10dab6164-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.801119 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f58cda-06bb-41f5-b91d-fdf10dab6164-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.811136 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9f58cda-06bb-41f5-b91d-fdf10dab6164-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.815367 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfkt9\" (UniqueName: \"kubernetes.io/projected/e9f58cda-06bb-41f5-b91d-fdf10dab6164-kube-api-access-pfkt9\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.872581 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e9f58cda-06bb-41f5-b91d-fdf10dab6164\") " pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.895733 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828d6b05-06be-4157-8163-96a3220fedb0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.895803 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/828d6b05-06be-4157-8163-96a3220fedb0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.895826 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/828d6b05-06be-4157-8163-96a3220fedb0-config-data\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.895861 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7lg\" (UniqueName: \"kubernetes.io/projected/828d6b05-06be-4157-8163-96a3220fedb0-kube-api-access-lk7lg\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.895892 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/828d6b05-06be-4157-8163-96a3220fedb0-kolla-config\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.896642 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/828d6b05-06be-4157-8163-96a3220fedb0-kolla-config\") pod \"memcached-0\" (UID: 
\"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.897789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/828d6b05-06be-4157-8163-96a3220fedb0-config-data\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.903355 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/828d6b05-06be-4157-8163-96a3220fedb0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.909393 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828d6b05-06be-4157-8163-96a3220fedb0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:15 crc kubenswrapper[4907]: I1129 14:49:15.939791 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk7lg\" (UniqueName: \"kubernetes.io/projected/828d6b05-06be-4157-8163-96a3220fedb0-kube-api-access-lk7lg\") pod \"memcached-0\" (UID: \"828d6b05-06be-4157-8163-96a3220fedb0\") " pod="openstack/memcached-0" Nov 29 14:49:16 crc kubenswrapper[4907]: I1129 14:49:16.021889 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 29 14:49:16 crc kubenswrapper[4907]: I1129 14:49:16.149125 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:17 crc kubenswrapper[4907]: I1129 14:49:17.799265 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 14:49:17 crc kubenswrapper[4907]: I1129 14:49:17.800832 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 14:49:17 crc kubenswrapper[4907]: I1129 14:49:17.807172 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vcs9l" Nov 29 14:49:17 crc kubenswrapper[4907]: I1129 14:49:17.843209 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 14:49:17 crc kubenswrapper[4907]: I1129 14:49:17.937460 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqk5\" (UniqueName: \"kubernetes.io/projected/ce177523-3519-4f04-b71c-7869b8bb5810-kube-api-access-6mqk5\") pod \"kube-state-metrics-0\" (UID: \"ce177523-3519-4f04-b71c-7869b8bb5810\") " pod="openstack/kube-state-metrics-0" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.040639 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqk5\" (UniqueName: \"kubernetes.io/projected/ce177523-3519-4f04-b71c-7869b8bb5810-kube-api-access-6mqk5\") pod \"kube-state-metrics-0\" (UID: \"ce177523-3519-4f04-b71c-7869b8bb5810\") " pod="openstack/kube-state-metrics-0" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.075302 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqk5\" (UniqueName: \"kubernetes.io/projected/ce177523-3519-4f04-b71c-7869b8bb5810-kube-api-access-6mqk5\") pod \"kube-state-metrics-0\" (UID: \"ce177523-3519-4f04-b71c-7869b8bb5810\") " pod="openstack/kube-state-metrics-0" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.176130 4907 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.547015 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf"] Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.550645 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.552426 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.556700 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf"] Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.560857 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-z9f7x" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.656507 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9w4\" (UniqueName: \"kubernetes.io/projected/985db950-2dae-4f2f-8ea4-289b661b1481-kube-api-access-zh9w4\") pod \"observability-ui-dashboards-7d5fb4cbfb-v6mrf\" (UID: \"985db950-2dae-4f2f-8ea4-289b661b1481\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.656657 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/985db950-2dae-4f2f-8ea4-289b661b1481-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-v6mrf\" (UID: \"985db950-2dae-4f2f-8ea4-289b661b1481\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" Nov 29 14:49:18 crc kubenswrapper[4907]: 
I1129 14:49:18.759720 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/985db950-2dae-4f2f-8ea4-289b661b1481-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-v6mrf\" (UID: \"985db950-2dae-4f2f-8ea4-289b661b1481\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.759831 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9w4\" (UniqueName: \"kubernetes.io/projected/985db950-2dae-4f2f-8ea4-289b661b1481-kube-api-access-zh9w4\") pod \"observability-ui-dashboards-7d5fb4cbfb-v6mrf\" (UID: \"985db950-2dae-4f2f-8ea4-289b661b1481\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.766368 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/985db950-2dae-4f2f-8ea4-289b661b1481-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-v6mrf\" (UID: \"985db950-2dae-4f2f-8ea4-289b661b1481\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.793126 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh9w4\" (UniqueName: \"kubernetes.io/projected/985db950-2dae-4f2f-8ea4-289b661b1481-kube-api-access-zh9w4\") pod \"observability-ui-dashboards-7d5fb4cbfb-v6mrf\" (UID: \"985db950-2dae-4f2f-8ea4-289b661b1481\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.873925 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.878602 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7856986b97-4nnqk"] Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.879722 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.942230 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7856986b97-4nnqk"] Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.963360 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-oauth-serving-cert\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.963430 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-console-serving-cert\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.963505 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-console-oauth-config\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.963538 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmcrz\" (UniqueName: \"kubernetes.io/projected/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-kube-api-access-dmcrz\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.963562 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-service-ca\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.963588 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-console-config\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:18 crc kubenswrapper[4907]: I1129 14:49:18.963629 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-trusted-ca-bundle\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.066747 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-console-oauth-config\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 
14:49:19.066793 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmcrz\" (UniqueName: \"kubernetes.io/projected/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-kube-api-access-dmcrz\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.066818 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-service-ca\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.066852 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-console-config\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.066897 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-trusted-ca-bundle\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.066934 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-oauth-serving-cert\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.066981 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-console-serving-cert\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.068061 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-oauth-serving-cert\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.068097 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-trusted-ca-bundle\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.068286 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-console-config\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.069448 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-service-ca\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.077099 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-console-serving-cert\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.083233 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmcrz\" (UniqueName: \"kubernetes.io/projected/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-kube-api-access-dmcrz\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.084387 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bff9f61a-b60d-444d-9aa0-1619b76bc7a9-console-oauth-config\") pod \"console-7856986b97-4nnqk\" (UID: \"bff9f61a-b60d-444d-9aa0-1619b76bc7a9\") " pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.127215 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.129626 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.135105 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.135290 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-psfxj" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.135635 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.135833 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.136333 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.138679 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.161408 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.209717 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.281346 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.281401 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.281435 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.281474 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.281508 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.281537 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.281560 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5jqt\" (UniqueName: \"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-kube-api-access-m5jqt\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.281600 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.385559 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.385628 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.385659 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.385694 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.385728 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.385753 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5jqt\" (UniqueName: \"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-kube-api-access-m5jqt\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.385792 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.385843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.386284 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.386470 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.390735 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.391111 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.393002 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.393147 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.408301 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.411263 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5jqt\" (UniqueName: \"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-kube-api-access-m5jqt\") pod \"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.429975 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"prometheus-metric-storage-0\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:19 crc kubenswrapper[4907]: I1129 14:49:19.454184 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 14:49:20 crc kubenswrapper[4907]: I1129 14:49:20.962724 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 14:49:20 crc kubenswrapper[4907]: I1129 14:49:20.964862 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:20 crc kubenswrapper[4907]: I1129 14:49:20.969066 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 29 14:49:20 crc kubenswrapper[4907]: I1129 14:49:20.969332 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 29 14:49:20 crc kubenswrapper[4907]: I1129 14:49:20.969585 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 29 14:49:20 crc kubenswrapper[4907]: I1129 14:49:20.969635 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dbptn" Nov 29 14:49:20 crc kubenswrapper[4907]: I1129 14:49:20.969834 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 29 14:49:20 crc kubenswrapper[4907]: I1129 14:49:20.983936 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.118939 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/76797dae-1bc6-4e63-824b-423fab640187-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " 
pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.119008 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/76797dae-1bc6-4e63-824b-423fab640187-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.119047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76797dae-1bc6-4e63-824b-423fab640187-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.119119 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.119150 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/76797dae-1bc6-4e63-824b-423fab640187-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.119194 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76797dae-1bc6-4e63-824b-423fab640187-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 
14:49:21.119246 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ghbq\" (UniqueName: \"kubernetes.io/projected/76797dae-1bc6-4e63-824b-423fab640187-kube-api-access-5ghbq\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.119290 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76797dae-1bc6-4e63-824b-423fab640187-config\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.220883 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76797dae-1bc6-4e63-824b-423fab640187-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.220977 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ghbq\" (UniqueName: \"kubernetes.io/projected/76797dae-1bc6-4e63-824b-423fab640187-kube-api-access-5ghbq\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.221035 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76797dae-1bc6-4e63-824b-423fab640187-config\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.221092 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/76797dae-1bc6-4e63-824b-423fab640187-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.221123 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/76797dae-1bc6-4e63-824b-423fab640187-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.221166 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76797dae-1bc6-4e63-824b-423fab640187-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.221243 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.221280 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/76797dae-1bc6-4e63-824b-423fab640187-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.221767 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") device mount path \"/mnt/openstack/pv01\"" 
pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.221921 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/76797dae-1bc6-4e63-824b-423fab640187-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.222621 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76797dae-1bc6-4e63-824b-423fab640187-config\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.222694 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76797dae-1bc6-4e63-824b-423fab640187-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.226401 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/76797dae-1bc6-4e63-824b-423fab640187-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.226499 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/76797dae-1bc6-4e63-824b-423fab640187-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.233755 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/76797dae-1bc6-4e63-824b-423fab640187-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.238008 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ghbq\" (UniqueName: \"kubernetes.io/projected/76797dae-1bc6-4e63-824b-423fab640187-kube-api-access-5ghbq\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.249642 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"76797dae-1bc6-4e63-824b-423fab640187\") " pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:21 crc kubenswrapper[4907]: I1129 14:49:21.294940 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:22 crc kubenswrapper[4907]: I1129 14:49:22.975245 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m4mz2"] Nov 29 14:49:22 crc kubenswrapper[4907]: I1129 14:49:22.977095 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:22 crc kubenswrapper[4907]: I1129 14:49:22.979107 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 29 14:49:22 crc kubenswrapper[4907]: I1129 14:49:22.980546 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 29 14:49:22 crc kubenswrapper[4907]: I1129 14:49:22.981274 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-lfklp" Nov 29 14:49:22 crc kubenswrapper[4907]: I1129 14:49:22.996841 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m4mz2"] Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.032306 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-ndrfl"] Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.034904 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.040130 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ndrfl"] Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.072916 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33f5965b-43ae-484d-9c5c-1a54ae4de6da-var-log-ovn\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.072960 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbcww\" (UniqueName: \"kubernetes.io/projected/33f5965b-43ae-484d-9c5c-1a54ae4de6da-kube-api-access-rbcww\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.072996 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f5965b-43ae-484d-9c5c-1a54ae4de6da-combined-ca-bundle\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.073077 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/33f5965b-43ae-484d-9c5c-1a54ae4de6da-ovn-controller-tls-certs\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.073131 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/33f5965b-43ae-484d-9c5c-1a54ae4de6da-var-run\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.073190 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33f5965b-43ae-484d-9c5c-1a54ae4de6da-var-run-ovn\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.073228 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33f5965b-43ae-484d-9c5c-1a54ae4de6da-scripts\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174528 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-etc-ovs\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174586 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/33f5965b-43ae-484d-9c5c-1a54ae4de6da-ovn-controller-tls-certs\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174617 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-scripts\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174641 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33f5965b-43ae-484d-9c5c-1a54ae4de6da-var-run\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174682 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33f5965b-43ae-484d-9c5c-1a54ae4de6da-var-run-ovn\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174717 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33f5965b-43ae-484d-9c5c-1a54ae4de6da-scripts\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174754 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-var-run\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174781 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33f5965b-43ae-484d-9c5c-1a54ae4de6da-var-log-ovn\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " 
pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174805 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbcww\" (UniqueName: \"kubernetes.io/projected/33f5965b-43ae-484d-9c5c-1a54ae4de6da-kube-api-access-rbcww\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174834 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f5965b-43ae-484d-9c5c-1a54ae4de6da-combined-ca-bundle\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6szd\" (UniqueName: \"kubernetes.io/projected/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-kube-api-access-h6szd\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174876 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-var-log\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.174896 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-var-lib\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 
14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.180789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f5965b-43ae-484d-9c5c-1a54ae4de6da-combined-ca-bundle\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.181695 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33f5965b-43ae-484d-9c5c-1a54ae4de6da-scripts\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.182522 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33f5965b-43ae-484d-9c5c-1a54ae4de6da-var-run-ovn\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.182552 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33f5965b-43ae-484d-9c5c-1a54ae4de6da-var-run\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.184647 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33f5965b-43ae-484d-9c5c-1a54ae4de6da-var-log-ovn\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.186122 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/33f5965b-43ae-484d-9c5c-1a54ae4de6da-ovn-controller-tls-certs\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.202194 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbcww\" (UniqueName: \"kubernetes.io/projected/33f5965b-43ae-484d-9c5c-1a54ae4de6da-kube-api-access-rbcww\") pod \"ovn-controller-m4mz2\" (UID: \"33f5965b-43ae-484d-9c5c-1a54ae4de6da\") " pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.276135 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6szd\" (UniqueName: \"kubernetes.io/projected/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-kube-api-access-h6szd\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.276200 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-var-log\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.276230 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-var-lib\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.276278 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-etc-ovs\") pod \"ovn-controller-ovs-ndrfl\" (UID: 
\"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.276325 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-scripts\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.276493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-var-run\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.276610 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-var-run\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.277115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-var-log\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.277280 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-var-lib\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.277417 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-etc-ovs\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.279700 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-scripts\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.302063 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6szd\" (UniqueName: \"kubernetes.io/projected/b3d93208-e155-4746-bfd4-2d6d7d04dc2e-kube-api-access-h6szd\") pod \"ovn-controller-ovs-ndrfl\" (UID: \"b3d93208-e155-4746-bfd4-2d6d7d04dc2e\") " pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.326351 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:23 crc kubenswrapper[4907]: I1129 14:49:23.357511 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:24 crc kubenswrapper[4907]: I1129 14:49:24.362511 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 14:49:24 crc kubenswrapper[4907]: I1129 14:49:24.847224 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 14:49:24 crc kubenswrapper[4907]: I1129 14:49:24.850605 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:24 crc kubenswrapper[4907]: I1129 14:49:24.855150 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 29 14:49:24 crc kubenswrapper[4907]: I1129 14:49:24.855250 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 29 14:49:24 crc kubenswrapper[4907]: I1129 14:49:24.856426 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 29 14:49:24 crc kubenswrapper[4907]: I1129 14:49:24.856794 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-zcxxw" Nov 29 14:49:24 crc kubenswrapper[4907]: I1129 14:49:24.868423 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.020255 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.020459 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v8nv\" (UniqueName: \"kubernetes.io/projected/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-kube-api-access-7v8nv\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.020715 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-metrics-certs-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.020758 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.020831 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.020995 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.021078 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.021125 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 
14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.125628 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.125726 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.125774 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v8nv\" (UniqueName: \"kubernetes.io/projected/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-kube-api-access-7v8nv\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.125862 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.125891 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.125930 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.125999 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.126036 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.126370 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.130663 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.131248 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 
14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.132108 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.142151 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.142674 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.143517 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.148997 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v8nv\" (UniqueName: \"kubernetes.io/projected/7c6489ed-0658-49ec-8ae2-a43a8cf795ef-kube-api-access-7v8nv\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.164704 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7c6489ed-0658-49ec-8ae2-a43a8cf795ef\") " pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:25 crc kubenswrapper[4907]: I1129 14:49:25.174792 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:29 crc kubenswrapper[4907]: I1129 14:49:29.602770 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 14:49:29 crc kubenswrapper[4907]: I1129 14:49:29.713655 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b4712d7-a81b-455f-841a-a0ca14eafcbe","Type":"ContainerStarted","Data":"579c35bcf4a9434cb3379263bec365d2e0367de57d5733714b09b6307c3e2cfe"} Nov 29 14:49:30 crc kubenswrapper[4907]: W1129 14:49:30.021379 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce177523_3519_4f04_b71c_7869b8bb5810.slice/crio-e0c6a10f72f299b01eeb9c055ed134aaa1ed211f1eb60bedca5dca5f580d06f5 WatchSource:0}: Error finding container e0c6a10f72f299b01eeb9c055ed134aaa1ed211f1eb60bedca5dca5f580d06f5: Status 404 returned error can't find the container with id e0c6a10f72f299b01eeb9c055ed134aaa1ed211f1eb60bedca5dca5f580d06f5 Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.066188 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.066359 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjf77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-h69pd_openstack(d866421e-a132-4651-8452-f6cf2af40410): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.067561 4907 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" podUID="d866421e-a132-4651-8452-f6cf2af40410" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.105832 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.105960 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qk9zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-w9lr9_openstack(c8d62e8f-7e5c-4fe1-b06d-4499b10075e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.109197 4907 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.109429 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cmkh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesyste
m:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-hrnbt_openstack(5dbd4b99-a473-4c9b-a372-492c3ae34064): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.109788 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" podUID="c8d62e8f-7e5c-4fe1-b06d-4499b10075e3" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.112811 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" podUID="5dbd4b99-a473-4c9b-a372-492c3ae34064" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.135575 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.136344 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 
--log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mhpcm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-dfdx9_openstack(37167587-89bd-4132-89b5-c71426846175): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:49:30 crc 
kubenswrapper[4907]: E1129 14:49:30.137610 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" podUID="37167587-89bd-4132-89b5-c71426846175" Nov 29 14:49:30 crc kubenswrapper[4907]: I1129 14:49:30.738138 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce177523-3519-4f04-b71c-7869b8bb5810","Type":"ContainerStarted","Data":"e0c6a10f72f299b01eeb9c055ed134aaa1ed211f1eb60bedca5dca5f580d06f5"} Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.740766 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" podUID="37167587-89bd-4132-89b5-c71426846175" Nov 29 14:49:30 crc kubenswrapper[4907]: E1129 14:49:30.740791 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" podUID="5dbd4b99-a473-4c9b-a372-492c3ae34064" Nov 29 14:49:30 crc kubenswrapper[4907]: I1129 14:49:30.908994 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 14:49:30 crc kubenswrapper[4907]: I1129 14:49:30.921354 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7856986b97-4nnqk"] Nov 29 14:49:30 crc kubenswrapper[4907]: I1129 14:49:30.928545 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf"] Nov 29 14:49:30 crc kubenswrapper[4907]: I1129 
14:49:30.934909 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.390672 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.398812 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m4mz2"] Nov 29 14:49:31 crc kubenswrapper[4907]: W1129 14:49:31.551426 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d2e724b_ab17_45b3_a5ec_c43bf54e935d.slice/crio-cb0ace32345c48af82b8ee4bfd9550cf5176ba5b3881e7027032b8abd3746ba0 WatchSource:0}: Error finding container cb0ace32345c48af82b8ee4bfd9550cf5176ba5b3881e7027032b8abd3746ba0: Status 404 returned error can't find the container with id cb0ace32345c48af82b8ee4bfd9550cf5176ba5b3881e7027032b8abd3746ba0 Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.576785 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 14:49:31 crc kubenswrapper[4907]: W1129 14:49:31.652667 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33f5965b_43ae_484d_9c5c_1a54ae4de6da.slice/crio-8ae753ca0d691308cda1de15d697d3e58a1a309fe284eb5ab2bb0df76fb48c3e WatchSource:0}: Error finding container 8ae753ca0d691308cda1de15d697d3e58a1a309fe284eb5ab2bb0df76fb48c3e: Status 404 returned error can't find the container with id 8ae753ca0d691308cda1de15d697d3e58a1a309fe284eb5ab2bb0df76fb48c3e Nov 29 14:49:31 crc kubenswrapper[4907]: W1129 14:49:31.653807 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod828d6b05_06be_4157_8163_96a3220fedb0.slice/crio-8c7198533d96f35cf8240b0d9b33eb08f9c2982c65cf271295348aa68010edd8 WatchSource:0}: Error finding container 
8c7198533d96f35cf8240b0d9b33eb08f9c2982c65cf271295348aa68010edd8: Status 404 returned error can't find the container with id 8c7198533d96f35cf8240b0d9b33eb08f9c2982c65cf271295348aa68010edd8 Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.750247 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" event={"ID":"985db950-2dae-4f2f-8ea4-289b661b1481","Type":"ContainerStarted","Data":"1e93b53286fa902640a20940376a988f38a34de24b51375baff61741f162a735"} Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.753294 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7856986b97-4nnqk" event={"ID":"bff9f61a-b60d-444d-9aa0-1619b76bc7a9","Type":"ContainerStarted","Data":"eecb2b43bc5c0cf828557af2b129d5544d0e492c8ea7f5216bcf27baf5961241"} Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.754469 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" event={"ID":"d866421e-a132-4651-8452-f6cf2af40410","Type":"ContainerDied","Data":"ba98db32292acb6db1db7c82f0a5f7badd3484d246ee6652af79bd90abdbe869"} Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.754503 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba98db32292acb6db1db7c82f0a5f7badd3484d246ee6652af79bd90abdbe869" Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.755983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"828d6b05-06be-4157-8163-96a3220fedb0","Type":"ContainerStarted","Data":"8c7198533d96f35cf8240b0d9b33eb08f9c2982c65cf271295348aa68010edd8"} Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.757873 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" event={"ID":"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3","Type":"ContainerDied","Data":"613a9d0747ecf2e820d4bcb7c62b3996f92fd44f7430efedddc739ced94eedc8"} Nov 29 
14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.757908 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="613a9d0747ecf2e820d4bcb7c62b3996f92fd44f7430efedddc739ced94eedc8" Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.759125 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerStarted","Data":"cb0ace32345c48af82b8ee4bfd9550cf5176ba5b3881e7027032b8abd3746ba0"} Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.760147 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e9f58cda-06bb-41f5-b91d-fdf10dab6164","Type":"ContainerStarted","Data":"e3647d520488d8ec6743115c972e2ec4d5cb2a9d7cecd2659cf027596a8168e2"} Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.761829 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m4mz2" event={"ID":"33f5965b-43ae-484d-9c5c-1a54ae4de6da","Type":"ContainerStarted","Data":"8ae753ca0d691308cda1de15d697d3e58a1a309fe284eb5ab2bb0df76fb48c3e"} Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.823904 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.831537 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:31 crc kubenswrapper[4907]: W1129 14:49:31.855925 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76797dae_1bc6_4e63_824b_423fab640187.slice/crio-86cd83e03c748b6de365713e0b1132953d595577fd9e9bc6d680076e522c042d WatchSource:0}: Error finding container 86cd83e03c748b6de365713e0b1132953d595577fd9e9bc6d680076e522c042d: Status 404 returned error can't find the container with id 86cd83e03c748b6de365713e0b1132953d595577fd9e9bc6d680076e522c042d Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.989997 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-dns-svc\") pod \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.990139 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk9zs\" (UniqueName: \"kubernetes.io/projected/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-kube-api-access-qk9zs\") pod \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.990235 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-config\") pod \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\" (UID: \"c8d62e8f-7e5c-4fe1-b06d-4499b10075e3\") " Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.990337 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjf77\" (UniqueName: \"kubernetes.io/projected/d866421e-a132-4651-8452-f6cf2af40410-kube-api-access-xjf77\") pod \"d866421e-a132-4651-8452-f6cf2af40410\" (UID: 
\"d866421e-a132-4651-8452-f6cf2af40410\") " Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.990385 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d866421e-a132-4651-8452-f6cf2af40410-config\") pod \"d866421e-a132-4651-8452-f6cf2af40410\" (UID: \"d866421e-a132-4651-8452-f6cf2af40410\") " Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.990717 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8d62e8f-7e5c-4fe1-b06d-4499b10075e3" (UID: "c8d62e8f-7e5c-4fe1-b06d-4499b10075e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.991248 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-config" (OuterVolumeSpecName: "config") pod "c8d62e8f-7e5c-4fe1-b06d-4499b10075e3" (UID: "c8d62e8f-7e5c-4fe1-b06d-4499b10075e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.991589 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.991613 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.992534 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d866421e-a132-4651-8452-f6cf2af40410-config" (OuterVolumeSpecName: "config") pod "d866421e-a132-4651-8452-f6cf2af40410" (UID: "d866421e-a132-4651-8452-f6cf2af40410"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.997649 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-kube-api-access-qk9zs" (OuterVolumeSpecName: "kube-api-access-qk9zs") pod "c8d62e8f-7e5c-4fe1-b06d-4499b10075e3" (UID: "c8d62e8f-7e5c-4fe1-b06d-4499b10075e3"). InnerVolumeSpecName "kube-api-access-qk9zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:49:31 crc kubenswrapper[4907]: I1129 14:49:31.998828 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d866421e-a132-4651-8452-f6cf2af40410-kube-api-access-xjf77" (OuterVolumeSpecName: "kube-api-access-xjf77") pod "d866421e-a132-4651-8452-f6cf2af40410" (UID: "d866421e-a132-4651-8452-f6cf2af40410"). InnerVolumeSpecName "kube-api-access-xjf77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.083480 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.093102 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk9zs\" (UniqueName: \"kubernetes.io/projected/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3-kube-api-access-qk9zs\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.093132 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjf77\" (UniqueName: \"kubernetes.io/projected/d866421e-a132-4651-8452-f6cf2af40410-kube-api-access-xjf77\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.093145 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d866421e-a132-4651-8452-f6cf2af40410-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:32 crc kubenswrapper[4907]: W1129 14:49:32.128404 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c6489ed_0658_49ec_8ae2_a43a8cf795ef.slice/crio-9951e59b95547ad74cda83bf35299825a586d92ba178dc1183897f55aec615cf WatchSource:0}: Error finding container 9951e59b95547ad74cda83bf35299825a586d92ba178dc1183897f55aec615cf: Status 404 returned error can't find the container with id 9951e59b95547ad74cda83bf35299825a586d92ba178dc1183897f55aec615cf Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.336982 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ndrfl"] Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.773267 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"550beb06-c1d9-4568-bb4e-66ff9134cb8e","Type":"ContainerStarted","Data":"57e9e8cb7a73ad419533e89d66570ed4fe9e23124e85292737bf8eb73653d7b3"} Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.775365 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"76797dae-1bc6-4e63-824b-423fab640187","Type":"ContainerStarted","Data":"86cd83e03c748b6de365713e0b1132953d595577fd9e9bc6d680076e522c042d"} Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.777117 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8aee0179-2960-486d-8129-1e928d55a29f","Type":"ContainerStarted","Data":"b6497214a1f83cc739bcf81c7e151db9861c2777d57157ead4c53dcba818c0a6"} Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.778243 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-h69pd" Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.778245 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7c6489ed-0658-49ec-8ae2-a43a8cf795ef","Type":"ContainerStarted","Data":"9951e59b95547ad74cda83bf35299825a586d92ba178dc1183897f55aec615cf"} Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.778321 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w9lr9" Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.896501 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-h69pd"] Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.902723 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-h69pd"] Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.919650 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9lr9"] Nov 29 14:49:32 crc kubenswrapper[4907]: I1129 14:49:32.929811 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w9lr9"] Nov 29 14:49:34 crc kubenswrapper[4907]: W1129 14:49:34.129397 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3d93208_e155_4746_bfd4_2d6d7d04dc2e.slice/crio-b15429d69927c088d63bc033548cb5a3c2a673f3e25c42c40970586b94267c48 WatchSource:0}: Error finding container b15429d69927c088d63bc033548cb5a3c2a673f3e25c42c40970586b94267c48: Status 404 returned error can't find the container with id b15429d69927c088d63bc033548cb5a3c2a673f3e25c42c40970586b94267c48 Nov 29 14:49:34 crc kubenswrapper[4907]: I1129 14:49:34.503319 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8d62e8f-7e5c-4fe1-b06d-4499b10075e3" path="/var/lib/kubelet/pods/c8d62e8f-7e5c-4fe1-b06d-4499b10075e3/volumes" Nov 29 14:49:34 crc kubenswrapper[4907]: I1129 14:49:34.504179 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d866421e-a132-4651-8452-f6cf2af40410" path="/var/lib/kubelet/pods/d866421e-a132-4651-8452-f6cf2af40410/volumes" Nov 29 14:49:34 crc kubenswrapper[4907]: I1129 14:49:34.798169 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7856986b97-4nnqk" 
event={"ID":"bff9f61a-b60d-444d-9aa0-1619b76bc7a9","Type":"ContainerStarted","Data":"b86fd687656aa6f2cbd12406367a0c69e0249c2e43abf7e2ccb8b72425e80436"} Nov 29 14:49:34 crc kubenswrapper[4907]: I1129 14:49:34.801414 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ndrfl" event={"ID":"b3d93208-e155-4746-bfd4-2d6d7d04dc2e","Type":"ContainerStarted","Data":"b15429d69927c088d63bc033548cb5a3c2a673f3e25c42c40970586b94267c48"} Nov 29 14:49:34 crc kubenswrapper[4907]: I1129 14:49:34.820078 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7856986b97-4nnqk" podStartSLOduration=16.820061367 podStartE2EDuration="16.820061367s" podCreationTimestamp="2025-11-29 14:49:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:49:34.811991639 +0000 UTC m=+1272.798829301" watchObservedRunningTime="2025-11-29 14:49:34.820061367 +0000 UTC m=+1272.806899029" Nov 29 14:49:39 crc kubenswrapper[4907]: I1129 14:49:39.210488 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:39 crc kubenswrapper[4907]: I1129 14:49:39.211095 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:39 crc kubenswrapper[4907]: I1129 14:49:39.221498 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:39 crc kubenswrapper[4907]: I1129 14:49:39.859230 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7856986b97-4nnqk" Nov 29 14:49:39 crc kubenswrapper[4907]: I1129 14:49:39.937638 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b5dbd778c-bhss9"] Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 
14:49:43.905194 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7c6489ed-0658-49ec-8ae2-a43a8cf795ef","Type":"ContainerStarted","Data":"ff54579b6df0563b882515cb46909d116c102de0ef6761a96346f7d1ccae5767"} Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.907822 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b4712d7-a81b-455f-841a-a0ca14eafcbe","Type":"ContainerStarted","Data":"baf549377a42a132bc770a8c0bbbc086032cea3c4284a0938719bc334005118b"} Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.915633 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"76797dae-1bc6-4e63-824b-423fab640187","Type":"ContainerStarted","Data":"c0471f93115c12f2938dbc7f28d3d48a443cee1d7966c30f7bf273a36c395c40"} Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.918587 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"828d6b05-06be-4157-8163-96a3220fedb0","Type":"ContainerStarted","Data":"19e299b98e7052a49ebf82f63a4a18471cd2961de5b0082500847e9399203772"} Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.925208 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e9f58cda-06bb-41f5-b91d-fdf10dab6164","Type":"ContainerStarted","Data":"fe040a02f5460fbb07152c7f84e13523eddefef2b342d5269f1fc91bf7543092"} Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.928039 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m4mz2" event={"ID":"33f5965b-43ae-484d-9c5c-1a54ae4de6da","Type":"ContainerStarted","Data":"8f72c2e9abe6809de82784161ca9d5b075d7fbb8c5d9d68b6771eec6c0de4693"} Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.930971 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"ce177523-3519-4f04-b71c-7869b8bb5810","Type":"ContainerStarted","Data":"cf8589876a6e854b169096f90c834a8b201454ffd021d1f879afe255287d27af"} Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.931717 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.933146 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ndrfl" event={"ID":"b3d93208-e155-4746-bfd4-2d6d7d04dc2e","Type":"ContainerStarted","Data":"374a09636309b319f39e59568ce9411c02a94711ccd4699341670c6fc1bbe8df"} Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.935794 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" event={"ID":"985db950-2dae-4f2f-8ea4-289b661b1481","Type":"ContainerStarted","Data":"561f60af13b5f8eb8445f3cbfaa6e11fe06f7663d91445503d2a8d85f98b54c0"} Nov 29 14:49:43 crc kubenswrapper[4907]: I1129 14:49:43.995022 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-v6mrf" podStartSLOduration=15.29195257 podStartE2EDuration="25.995003713s" podCreationTimestamp="2025-11-29 14:49:18 +0000 UTC" firstStartedPulling="2025-11-29 14:49:31.555301168 +0000 UTC m=+1269.542138820" lastFinishedPulling="2025-11-29 14:49:42.258352281 +0000 UTC m=+1280.245189963" observedRunningTime="2025-11-29 14:49:43.988606093 +0000 UTC m=+1281.975443745" watchObservedRunningTime="2025-11-29 14:49:43.995003713 +0000 UTC m=+1281.981841355" Nov 29 14:49:44 crc kubenswrapper[4907]: I1129 14:49:44.036542 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.742776499 podStartE2EDuration="27.036523913s" podCreationTimestamp="2025-11-29 14:49:17 +0000 UTC" firstStartedPulling="2025-11-29 14:49:30.026869432 +0000 UTC 
m=+1268.013707144" lastFinishedPulling="2025-11-29 14:49:42.320616906 +0000 UTC m=+1280.307454558" observedRunningTime="2025-11-29 14:49:44.02043454 +0000 UTC m=+1282.007272212" watchObservedRunningTime="2025-11-29 14:49:44.036523913 +0000 UTC m=+1282.023361565" Nov 29 14:49:44 crc kubenswrapper[4907]: I1129 14:49:44.879200 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-225jx"] Nov 29 14:49:44 crc kubenswrapper[4907]: I1129 14:49:44.882600 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:44 crc kubenswrapper[4907]: I1129 14:49:44.889467 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-225jx"] Nov 29 14:49:44 crc kubenswrapper[4907]: I1129 14:49:44.894451 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.013913 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e24485-836a-4a5d-a183-2f8dc0de5c07-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.013982 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/79e24485-836a-4a5d-a183-2f8dc0de5c07-ovs-rundir\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.014060 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgrs4\" (UniqueName: 
\"kubernetes.io/projected/79e24485-836a-4a5d-a183-2f8dc0de5c07-kube-api-access-hgrs4\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.014306 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e24485-836a-4a5d-a183-2f8dc0de5c07-combined-ca-bundle\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.015497 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79e24485-836a-4a5d-a183-2f8dc0de5c07-config\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.017084 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/79e24485-836a-4a5d-a183-2f8dc0de5c07-ovn-rundir\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.054658 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hrnbt"] Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.071799 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-h9rgl"] Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.073349 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.083030 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.108421 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-h9rgl"] Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.125878 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e24485-836a-4a5d-a183-2f8dc0de5c07-combined-ca-bundle\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126009 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79e24485-836a-4a5d-a183-2f8dc0de5c07-config\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126071 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htbfz\" (UniqueName: \"kubernetes.io/projected/4e555346-7e76-4f4c-a996-ce05d506d038-kube-api-access-htbfz\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126109 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/79e24485-836a-4a5d-a183-2f8dc0de5c07-ovn-rundir\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc 
kubenswrapper[4907]: I1129 14:49:45.126144 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-config\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126413 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126535 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e24485-836a-4a5d-a183-2f8dc0de5c07-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126599 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/79e24485-836a-4a5d-a183-2f8dc0de5c07-ovs-rundir\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126684 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgrs4\" (UniqueName: \"kubernetes.io/projected/79e24485-836a-4a5d-a183-2f8dc0de5c07-kube-api-access-hgrs4\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc 
kubenswrapper[4907]: I1129 14:49:45.126738 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126760 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79e24485-836a-4a5d-a183-2f8dc0de5c07-config\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126740 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/79e24485-836a-4a5d-a183-2f8dc0de5c07-ovs-rundir\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.126904 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/79e24485-836a-4a5d-a183-2f8dc0de5c07-ovn-rundir\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.131659 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e24485-836a-4a5d-a183-2f8dc0de5c07-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.135531 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e24485-836a-4a5d-a183-2f8dc0de5c07-combined-ca-bundle\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.154907 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgrs4\" (UniqueName: \"kubernetes.io/projected/79e24485-836a-4a5d-a183-2f8dc0de5c07-kube-api-access-hgrs4\") pod \"ovn-controller-metrics-225jx\" (UID: \"79e24485-836a-4a5d-a183-2f8dc0de5c07\") " pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.203143 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-225jx" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.213303 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dfdx9"] Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.229060 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htbfz\" (UniqueName: \"kubernetes.io/projected/4e555346-7e76-4f4c-a996-ce05d506d038-kube-api-access-htbfz\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.229118 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-config\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.229170 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.229246 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.230052 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-config\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.232253 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.232269 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.254515 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s2mxh"] Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.256186 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.259393 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.266197 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htbfz\" (UniqueName: \"kubernetes.io/projected/4e555346-7e76-4f4c-a996-ce05d506d038-kube-api-access-htbfz\") pod \"dnsmasq-dns-7fd796d7df-h9rgl\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.270830 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s2mxh"] Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.330558 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.330660 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-config\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.330786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plrh5\" (UniqueName: \"kubernetes.io/projected/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-kube-api-access-plrh5\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.330924 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.331029 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.401392 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.432984 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.433031 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-config\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.433070 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plrh5\" (UniqueName: 
\"kubernetes.io/projected/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-kube-api-access-plrh5\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.433110 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.433143 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.433989 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.434513 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.434993 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-config\") pod 
\"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.435781 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.481052 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plrh5\" (UniqueName: \"kubernetes.io/projected/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-kube-api-access-plrh5\") pod \"dnsmasq-dns-86db49b7ff-s2mxh\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.646895 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.793863 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-225jx"] Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.954159 4907 generic.go:334] "Generic (PLEG): container finished" podID="b3d93208-e155-4746-bfd4-2d6d7d04dc2e" containerID="374a09636309b319f39e59568ce9411c02a94711ccd4699341670c6fc1bbe8df" exitCode=0 Nov 29 14:49:45 crc kubenswrapper[4907]: I1129 14:49:45.954452 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ndrfl" event={"ID":"b3d93208-e155-4746-bfd4-2d6d7d04dc2e","Type":"ContainerDied","Data":"374a09636309b319f39e59568ce9411c02a94711ccd4699341670c6fc1bbe8df"} Nov 29 14:49:45 crc kubenswrapper[4907]: W1129 14:49:45.984678 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79e24485_836a_4a5d_a183_2f8dc0de5c07.slice/crio-1aec768d5aa82214a53b766665ff03e35a78f43b744c3d972a128f293c3d2a2a WatchSource:0}: Error finding container 1aec768d5aa82214a53b766665ff03e35a78f43b744c3d972a128f293c3d2a2a: Status 404 returned error can't find the container with id 1aec768d5aa82214a53b766665ff03e35a78f43b744c3d972a128f293c3d2a2a Nov 29 14:49:46 crc kubenswrapper[4907]: I1129 14:49:46.160048 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-h9rgl"] Nov 29 14:49:46 crc kubenswrapper[4907]: I1129 14:49:46.259476 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s2mxh"] Nov 29 14:49:46 crc kubenswrapper[4907]: I1129 14:49:46.972850 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" event={"ID":"4e555346-7e76-4f4c-a996-ce05d506d038","Type":"ContainerStarted","Data":"54432e9d7e183902aaa06a5a7cb3755567458ed5a9245b1e283803a1a8fa4365"} Nov 29 14:49:46 
crc kubenswrapper[4907]: I1129 14:49:46.974128 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-225jx" event={"ID":"79e24485-836a-4a5d-a183-2f8dc0de5c07","Type":"ContainerStarted","Data":"1aec768d5aa82214a53b766665ff03e35a78f43b744c3d972a128f293c3d2a2a"} Nov 29 14:49:46 crc kubenswrapper[4907]: I1129 14:49:46.975934 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" event={"ID":"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a","Type":"ContainerStarted","Data":"4b90530062f4362c6ac6c029361d5c91c740fdbb89b95b63a090ac891c29f1c5"} Nov 29 14:49:46 crc kubenswrapper[4907]: I1129 14:49:46.976014 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-m4mz2" Nov 29 14:49:47 crc kubenswrapper[4907]: I1129 14:49:47.004964 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-m4mz2" podStartSLOduration=14.310763301 podStartE2EDuration="25.004944854s" podCreationTimestamp="2025-11-29 14:49:22 +0000 UTC" firstStartedPulling="2025-11-29 14:49:31.655333357 +0000 UTC m=+1269.642171009" lastFinishedPulling="2025-11-29 14:49:42.34951491 +0000 UTC m=+1280.336352562" observedRunningTime="2025-11-29 14:49:46.996153946 +0000 UTC m=+1284.982991618" watchObservedRunningTime="2025-11-29 14:49:47.004944854 +0000 UTC m=+1284.991782516" Nov 29 14:49:47 crc kubenswrapper[4907]: I1129 14:49:47.041221 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.45346456 podStartE2EDuration="32.041199845s" podCreationTimestamp="2025-11-29 14:49:15 +0000 UTC" firstStartedPulling="2025-11-29 14:49:31.657316352 +0000 UTC m=+1269.644154004" lastFinishedPulling="2025-11-29 14:49:42.245051637 +0000 UTC m=+1280.231889289" observedRunningTime="2025-11-29 14:49:47.034050314 +0000 UTC m=+1285.020887976" watchObservedRunningTime="2025-11-29 14:49:47.041199845 +0000 UTC 
m=+1285.028037507" Nov 29 14:49:48 crc kubenswrapper[4907]: I1129 14:49:48.182651 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.017245 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerStarted","Data":"aa38b9a165cea5bc8a9d35d4608f4013dc3e0b908f032f057df0bb2cf95050b9"} Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.019053 4907 generic.go:334] "Generic (PLEG): container finished" podID="37167587-89bd-4132-89b5-c71426846175" containerID="3fcbe3696a5b00e8d25a01e4ae0acb58df033292c7ba1652e709ddb48de06023" exitCode=0 Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.019155 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" event={"ID":"37167587-89bd-4132-89b5-c71426846175","Type":"ContainerDied","Data":"3fcbe3696a5b00e8d25a01e4ae0acb58df033292c7ba1652e709ddb48de06023"} Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.021328 4907 generic.go:334] "Generic (PLEG): container finished" podID="5dbd4b99-a473-4c9b-a372-492c3ae34064" containerID="f4cead3684c7abd4529cdffc2a128775d8abd2647cfb5d8eaa770bef60150f51" exitCode=0 Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.021490 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" event={"ID":"5dbd4b99-a473-4c9b-a372-492c3ae34064","Type":"ContainerDied","Data":"f4cead3684c7abd4529cdffc2a128775d8abd2647cfb5d8eaa770bef60150f51"} Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.022508 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.023662 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 29 14:49:51 crc 
kubenswrapper[4907]: I1129 14:49:51.023949 4907 generic.go:334] "Generic (PLEG): container finished" podID="4e555346-7e76-4f4c-a996-ce05d506d038" containerID="1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b" exitCode=0 Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.024040 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" event={"ID":"4e555346-7e76-4f4c-a996-ce05d506d038","Type":"ContainerDied","Data":"1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b"} Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.026741 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ndrfl" event={"ID":"b3d93208-e155-4746-bfd4-2d6d7d04dc2e","Type":"ContainerStarted","Data":"55f65814958654486e1892c136b69fd3a82d04b1c6e42f38ed4df60364354ebb"} Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.026795 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ndrfl" event={"ID":"b3d93208-e155-4746-bfd4-2d6d7d04dc2e","Type":"ContainerStarted","Data":"da57e19c88c077470c7a452a4831ea3cb667b0286e67c832f68f3fb066cebaf9"} Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.027037 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.027093 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ndrfl" Nov 29 14:49:51 crc kubenswrapper[4907]: I1129 14:49:51.084317 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-ndrfl" podStartSLOduration=20.925203191 podStartE2EDuration="29.084297595s" podCreationTimestamp="2025-11-29 14:49:22 +0000 UTC" firstStartedPulling="2025-11-29 14:49:34.167924502 +0000 UTC m=+1272.154762154" lastFinishedPulling="2025-11-29 14:49:42.327018896 +0000 UTC m=+1280.313856558" 
observedRunningTime="2025-11-29 14:49:51.08374519 +0000 UTC m=+1289.070582842" watchObservedRunningTime="2025-11-29 14:49:51.084297595 +0000 UTC m=+1289.071135247" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.502543 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.509463 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.632668 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-config\") pod \"37167587-89bd-4132-89b5-c71426846175\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.632728 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-dns-svc\") pod \"37167587-89bd-4132-89b5-c71426846175\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.632803 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-dns-svc\") pod \"5dbd4b99-a473-4c9b-a372-492c3ae34064\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.632847 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhpcm\" (UniqueName: \"kubernetes.io/projected/37167587-89bd-4132-89b5-c71426846175-kube-api-access-mhpcm\") pod \"37167587-89bd-4132-89b5-c71426846175\" (UID: \"37167587-89bd-4132-89b5-c71426846175\") " Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.632916 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmkh5\" (UniqueName: \"kubernetes.io/projected/5dbd4b99-a473-4c9b-a372-492c3ae34064-kube-api-access-cmkh5\") pod \"5dbd4b99-a473-4c9b-a372-492c3ae34064\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.632987 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-config\") pod \"5dbd4b99-a473-4c9b-a372-492c3ae34064\" (UID: \"5dbd4b99-a473-4c9b-a372-492c3ae34064\") " Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.639608 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37167587-89bd-4132-89b5-c71426846175-kube-api-access-mhpcm" (OuterVolumeSpecName: "kube-api-access-mhpcm") pod "37167587-89bd-4132-89b5-c71426846175" (UID: "37167587-89bd-4132-89b5-c71426846175"). InnerVolumeSpecName "kube-api-access-mhpcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.644692 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbd4b99-a473-4c9b-a372-492c3ae34064-kube-api-access-cmkh5" (OuterVolumeSpecName: "kube-api-access-cmkh5") pod "5dbd4b99-a473-4c9b-a372-492c3ae34064" (UID: "5dbd4b99-a473-4c9b-a372-492c3ae34064"). InnerVolumeSpecName "kube-api-access-cmkh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.666251 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-config" (OuterVolumeSpecName: "config") pod "5dbd4b99-a473-4c9b-a372-492c3ae34064" (UID: "5dbd4b99-a473-4c9b-a372-492c3ae34064"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.675672 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-config" (OuterVolumeSpecName: "config") pod "37167587-89bd-4132-89b5-c71426846175" (UID: "37167587-89bd-4132-89b5-c71426846175"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.677043 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5dbd4b99-a473-4c9b-a372-492c3ae34064" (UID: "5dbd4b99-a473-4c9b-a372-492c3ae34064"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.677835 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37167587-89bd-4132-89b5-c71426846175" (UID: "37167587-89bd-4132-89b5-c71426846175"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.735045 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.735083 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37167587-89bd-4132-89b5-c71426846175-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.735097 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.735110 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhpcm\" (UniqueName: \"kubernetes.io/projected/37167587-89bd-4132-89b5-c71426846175-kube-api-access-mhpcm\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.735124 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmkh5\" (UniqueName: \"kubernetes.io/projected/5dbd4b99-a473-4c9b-a372-492c3ae34064-kube-api-access-cmkh5\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:52 crc kubenswrapper[4907]: I1129 14:49:52.735134 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbd4b99-a473-4c9b-a372-492c3ae34064-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.050981 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.050972 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dfdx9" event={"ID":"37167587-89bd-4132-89b5-c71426846175","Type":"ContainerDied","Data":"953a921c795234092ac3748e94973184f5f1bd9638f3458619a904d20cfc1890"} Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.051875 4907 scope.go:117] "RemoveContainer" containerID="3fcbe3696a5b00e8d25a01e4ae0acb58df033292c7ba1652e709ddb48de06023" Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.053552 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" event={"ID":"5dbd4b99-a473-4c9b-a372-492c3ae34064","Type":"ContainerDied","Data":"f4f825a49a46a56d9224b516811d9a0db9e3eae7571e232accb7a9a6e804284a"} Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.053668 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hrnbt" Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.062783 4907 generic.go:334] "Generic (PLEG): container finished" podID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" containerID="50efbf000a2ac92787a8ab43e6298d71203d3ba3b5a4f188572ff55c0298dd18" exitCode=0 Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.062824 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" event={"ID":"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a","Type":"ContainerDied","Data":"50efbf000a2ac92787a8ab43e6298d71203d3ba3b5a4f188572ff55c0298dd18"} Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.076730 4907 scope.go:117] "RemoveContainer" containerID="f4cead3684c7abd4529cdffc2a128775d8abd2647cfb5d8eaa770bef60150f51" Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.169459 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hrnbt"] Nov 29 14:49:53 crc 
kubenswrapper[4907]: I1129 14:49:53.185530 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hrnbt"] Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.218146 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dfdx9"] Nov 29 14:49:53 crc kubenswrapper[4907]: I1129 14:49:53.231161 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dfdx9"] Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.079173 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" event={"ID":"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a","Type":"ContainerStarted","Data":"86a936c5d892dcca038a100252784738ca3de880645f6a9565b680176372c593"} Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.081861 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.084144 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7c6489ed-0658-49ec-8ae2-a43a8cf795ef","Type":"ContainerStarted","Data":"d81701c62337af10e464979b853bc29f2db9c590b4e3a75cc98a36126f4a26ec"} Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.089842 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" event={"ID":"4e555346-7e76-4f4c-a996-ce05d506d038","Type":"ContainerStarted","Data":"3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06"} Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.090123 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.093085 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"76797dae-1bc6-4e63-824b-423fab640187","Type":"ContainerStarted","Data":"7e8762214337ad9ada09bade7fd8fd15fb5b7fb7d0cad1937a9ff54bdf3873bc"} Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.095234 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-225jx" event={"ID":"79e24485-836a-4a5d-a183-2f8dc0de5c07","Type":"ContainerStarted","Data":"22cfe31df5965dc65200f9bcd3c3957f40a1d0166afdf2302d0bf87f61c526b4"} Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.124557 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" podStartSLOduration=5.015553182 podStartE2EDuration="9.124524438s" podCreationTimestamp="2025-11-29 14:49:45 +0000 UTC" firstStartedPulling="2025-11-29 14:49:46.271659692 +0000 UTC m=+1284.258497344" lastFinishedPulling="2025-11-29 14:49:50.380630948 +0000 UTC m=+1288.367468600" observedRunningTime="2025-11-29 14:49:54.107252471 +0000 UTC m=+1292.094090123" watchObservedRunningTime="2025-11-29 14:49:54.124524438 +0000 UTC m=+1292.111362130" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.145714 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.293188156 podStartE2EDuration="31.145687064s" podCreationTimestamp="2025-11-29 14:49:23 +0000 UTC" firstStartedPulling="2025-11-29 14:49:32.131054071 +0000 UTC m=+1270.117891723" lastFinishedPulling="2025-11-29 14:49:52.983552979 +0000 UTC m=+1290.970390631" observedRunningTime="2025-11-29 14:49:54.134056316 +0000 UTC m=+1292.120893998" watchObservedRunningTime="2025-11-29 14:49:54.145687064 +0000 UTC m=+1292.132524756" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.165356 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" podStartSLOduration=5.034316411 podStartE2EDuration="9.165338678s" podCreationTimestamp="2025-11-29 14:49:45 
+0000 UTC" firstStartedPulling="2025-11-29 14:49:46.162764334 +0000 UTC m=+1284.149601986" lastFinishedPulling="2025-11-29 14:49:50.293786601 +0000 UTC m=+1288.280624253" observedRunningTime="2025-11-29 14:49:54.158620238 +0000 UTC m=+1292.145457930" watchObservedRunningTime="2025-11-29 14:49:54.165338678 +0000 UTC m=+1292.152176330" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.210433 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-225jx" podStartSLOduration=3.192268782 podStartE2EDuration="10.210413298s" podCreationTimestamp="2025-11-29 14:49:44 +0000 UTC" firstStartedPulling="2025-11-29 14:49:45.987033022 +0000 UTC m=+1283.973870674" lastFinishedPulling="2025-11-29 14:49:53.005177538 +0000 UTC m=+1290.992015190" observedRunningTime="2025-11-29 14:49:54.181690738 +0000 UTC m=+1292.168528430" watchObservedRunningTime="2025-11-29 14:49:54.210413298 +0000 UTC m=+1292.197250960" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.212005 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.093523579 podStartE2EDuration="35.211998272s" podCreationTimestamp="2025-11-29 14:49:19 +0000 UTC" firstStartedPulling="2025-11-29 14:49:31.863009478 +0000 UTC m=+1269.849847150" lastFinishedPulling="2025-11-29 14:49:52.981484191 +0000 UTC m=+1290.968321843" observedRunningTime="2025-11-29 14:49:54.207231558 +0000 UTC m=+1292.194069250" watchObservedRunningTime="2025-11-29 14:49:54.211998272 +0000 UTC m=+1292.198835934" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.296764 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.349759 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.492138 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37167587-89bd-4132-89b5-c71426846175" path="/var/lib/kubelet/pods/37167587-89bd-4132-89b5-c71426846175/volumes" Nov 29 14:49:54 crc kubenswrapper[4907]: I1129 14:49:54.492678 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dbd4b99-a473-4c9b-a372-492c3ae34064" path="/var/lib/kubelet/pods/5dbd4b99-a473-4c9b-a372-492c3ae34064/volumes" Nov 29 14:49:55 crc kubenswrapper[4907]: I1129 14:49:55.106956 4907 generic.go:334] "Generic (PLEG): container finished" podID="e9f58cda-06bb-41f5-b91d-fdf10dab6164" containerID="fe040a02f5460fbb07152c7f84e13523eddefef2b342d5269f1fc91bf7543092" exitCode=0 Nov 29 14:49:55 crc kubenswrapper[4907]: I1129 14:49:55.107073 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e9f58cda-06bb-41f5-b91d-fdf10dab6164","Type":"ContainerDied","Data":"fe040a02f5460fbb07152c7f84e13523eddefef2b342d5269f1fc91bf7543092"} Nov 29 14:49:55 crc kubenswrapper[4907]: I1129 14:49:55.111399 4907 generic.go:334] "Generic (PLEG): container finished" podID="2b4712d7-a81b-455f-841a-a0ca14eafcbe" containerID="baf549377a42a132bc770a8c0bbbc086032cea3c4284a0938719bc334005118b" exitCode=0 Nov 29 14:49:55 crc kubenswrapper[4907]: I1129 14:49:55.111624 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b4712d7-a81b-455f-841a-a0ca14eafcbe","Type":"ContainerDied","Data":"baf549377a42a132bc770a8c0bbbc086032cea3c4284a0938719bc334005118b"} Nov 29 14:49:55 crc kubenswrapper[4907]: I1129 14:49:55.113420 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:55 crc kubenswrapper[4907]: I1129 14:49:55.175689 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:55 crc kubenswrapper[4907]: I1129 14:49:55.175764 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:55 crc kubenswrapper[4907]: I1129 14:49:55.195174 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 29 14:49:55 crc kubenswrapper[4907]: I1129 14:49:55.266639 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.125193 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e9f58cda-06bb-41f5-b91d-fdf10dab6164","Type":"ContainerStarted","Data":"48195e87c4d808aa075f2412ac2ff2ba86453cdaf465a30a211abbb3ab7e6ab0"} Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.127404 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2b4712d7-a81b-455f-841a-a0ca14eafcbe","Type":"ContainerStarted","Data":"dd6152d4655751b599bce9356b941c7704fc0c4e01bcf1ce3cc5df573efa0838"} Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.149843 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.150143 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.159006 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=32.258383478 podStartE2EDuration="42.158982041s" podCreationTimestamp="2025-11-29 14:49:14 +0000 UTC" firstStartedPulling="2025-11-29 14:49:31.447663885 +0000 UTC m=+1269.434501537" lastFinishedPulling="2025-11-29 14:49:41.348262418 +0000 UTC m=+1279.335100100" observedRunningTime="2025-11-29 14:49:56.152766526 +0000 UTC m=+1294.139604188" watchObservedRunningTime="2025-11-29 14:49:56.158982041 +0000 UTC m=+1294.145819703" Nov 
29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.188975 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=36.931962856 podStartE2EDuration="43.188956915s" podCreationTimestamp="2025-11-29 14:49:13 +0000 UTC" firstStartedPulling="2025-11-29 14:49:29.194065217 +0000 UTC m=+1267.180902869" lastFinishedPulling="2025-11-29 14:49:35.451059276 +0000 UTC m=+1273.437896928" observedRunningTime="2025-11-29 14:49:56.186253709 +0000 UTC m=+1294.173091381" watchObservedRunningTime="2025-11-29 14:49:56.188956915 +0000 UTC m=+1294.175794567" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.217914 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.409342 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 29 14:49:56 crc kubenswrapper[4907]: E1129 14:49:56.409805 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbd4b99-a473-4c9b-a372-492c3ae34064" containerName="init" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.409830 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbd4b99-a473-4c9b-a372-492c3ae34064" containerName="init" Nov 29 14:49:56 crc kubenswrapper[4907]: E1129 14:49:56.409885 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37167587-89bd-4132-89b5-c71426846175" containerName="init" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.409894 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="37167587-89bd-4132-89b5-c71426846175" containerName="init" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.410138 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbd4b99-a473-4c9b-a372-492c3ae34064" containerName="init" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.410163 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="37167587-89bd-4132-89b5-c71426846175" containerName="init" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.411426 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.418345 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.419020 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.419504 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-fdsrc" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.419660 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.425037 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.535879 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8b80af-94d6-4c43-887b-07aafa877200-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.535932 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8b80af-94d6-4c43-887b-07aafa877200-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.536356 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fc8b80af-94d6-4c43-887b-07aafa877200-config\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.536489 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8b80af-94d6-4c43-887b-07aafa877200-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.536518 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc8b80af-94d6-4c43-887b-07aafa877200-scripts\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.536665 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc8b80af-94d6-4c43-887b-07aafa877200-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.536696 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z989n\" (UniqueName: \"kubernetes.io/projected/fc8b80af-94d6-4c43-887b-07aafa877200-kube-api-access-z989n\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.638310 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8b80af-94d6-4c43-887b-07aafa877200-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.638351 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8b80af-94d6-4c43-887b-07aafa877200-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.638402 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8b80af-94d6-4c43-887b-07aafa877200-config\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.638490 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8b80af-94d6-4c43-887b-07aafa877200-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.638508 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc8b80af-94d6-4c43-887b-07aafa877200-scripts\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.638555 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc8b80af-94d6-4c43-887b-07aafa877200-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.638571 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z989n\" (UniqueName: \"kubernetes.io/projected/fc8b80af-94d6-4c43-887b-07aafa877200-kube-api-access-z989n\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.639682 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc8b80af-94d6-4c43-887b-07aafa877200-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.640228 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc8b80af-94d6-4c43-887b-07aafa877200-scripts\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.640734 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8b80af-94d6-4c43-887b-07aafa877200-config\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.644247 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8b80af-94d6-4c43-887b-07aafa877200-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.647361 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8b80af-94d6-4c43-887b-07aafa877200-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: 
I1129 14:49:56.647980 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8b80af-94d6-4c43-887b-07aafa877200-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.658772 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z989n\" (UniqueName: \"kubernetes.io/projected/fc8b80af-94d6-4c43-887b-07aafa877200-kube-api-access-z989n\") pod \"ovn-northd-0\" (UID: \"fc8b80af-94d6-4c43-887b-07aafa877200\") " pod="openstack/ovn-northd-0" Nov 29 14:49:56 crc kubenswrapper[4907]: I1129 14:49:56.760905 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 14:49:57 crc kubenswrapper[4907]: I1129 14:49:57.108943 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 14:49:57 crc kubenswrapper[4907]: I1129 14:49:57.145592 4907 generic.go:334] "Generic (PLEG): container finished" podID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerID="aa38b9a165cea5bc8a9d35d4608f4013dc3e0b908f032f057df0bb2cf95050b9" exitCode=0 Nov 29 14:49:57 crc kubenswrapper[4907]: I1129 14:49:57.145674 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerDied","Data":"aa38b9a165cea5bc8a9d35d4608f4013dc3e0b908f032f057df0bb2cf95050b9"} Nov 29 14:49:57 crc kubenswrapper[4907]: I1129 14:49:57.147570 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc8b80af-94d6-4c43-887b-07aafa877200","Type":"ContainerStarted","Data":"bc42c457835b6d79245cc7ffa4abeb28cd737082ca01b5aa5b007f3f770c72af"} Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.200838 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7fd796d7df-h9rgl"] Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.201257 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" podUID="4e555346-7e76-4f4c-a996-ce05d506d038" containerName="dnsmasq-dns" containerID="cri-o://3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06" gracePeriod=10 Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.213300 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.231827 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-7w4m2"] Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.233382 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.277121 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7w4m2"] Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.378123 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lhb9\" (UniqueName: \"kubernetes.io/projected/13de3308-18f0-431d-997a-9288da0f520a-kube-api-access-5lhb9\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.378389 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-config\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.378467 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.378699 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-dns-svc\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.378751 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.479730 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.479791 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-dns-svc\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.479846 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.479892 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lhb9\" (UniqueName: \"kubernetes.io/projected/13de3308-18f0-431d-997a-9288da0f520a-kube-api-access-5lhb9\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.479925 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-config\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.480878 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-config\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.480971 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-dns-svc\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.481024 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.482091 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.502651 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lhb9\" (UniqueName: \"kubernetes.io/projected/13de3308-18f0-431d-997a-9288da0f520a-kube-api-access-5lhb9\") pod \"dnsmasq-dns-698758b865-7w4m2\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:58 crc kubenswrapper[4907]: I1129 14:49:58.625374 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.039491 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.091128 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-config\") pod \"4e555346-7e76-4f4c-a996-ce05d506d038\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.091305 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htbfz\" (UniqueName: \"kubernetes.io/projected/4e555346-7e76-4f4c-a996-ce05d506d038-kube-api-access-htbfz\") pod \"4e555346-7e76-4f4c-a996-ce05d506d038\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.091370 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-dns-svc\") pod \"4e555346-7e76-4f4c-a996-ce05d506d038\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.091393 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-ovsdbserver-nb\") pod \"4e555346-7e76-4f4c-a996-ce05d506d038\" (UID: \"4e555346-7e76-4f4c-a996-ce05d506d038\") " Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.097334 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e555346-7e76-4f4c-a996-ce05d506d038-kube-api-access-htbfz" (OuterVolumeSpecName: "kube-api-access-htbfz") pod "4e555346-7e76-4f4c-a996-ce05d506d038" (UID: "4e555346-7e76-4f4c-a996-ce05d506d038"). InnerVolumeSpecName "kube-api-access-htbfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.151091 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e555346-7e76-4f4c-a996-ce05d506d038" (UID: "4e555346-7e76-4f4c-a996-ce05d506d038"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.159883 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-config" (OuterVolumeSpecName: "config") pod "4e555346-7e76-4f4c-a996-ce05d506d038" (UID: "4e555346-7e76-4f4c-a996-ce05d506d038"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.175169 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e555346-7e76-4f4c-a996-ce05d506d038" (UID: "4e555346-7e76-4f4c-a996-ce05d506d038"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.194536 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.194566 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htbfz\" (UniqueName: \"kubernetes.io/projected/4e555346-7e76-4f4c-a996-ce05d506d038-kube-api-access-htbfz\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.194577 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.194586 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e555346-7e76-4f4c-a996-ce05d506d038-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.214717 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc8b80af-94d6-4c43-887b-07aafa877200","Type":"ContainerStarted","Data":"a75623c9bb4b6f001a17286ad2148be6fa81b5d61feaab648ca7b308d69c5f19"} Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.222200 4907 generic.go:334] "Generic (PLEG): container finished" podID="4e555346-7e76-4f4c-a996-ce05d506d038" containerID="3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06" exitCode=0 Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.222243 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" event={"ID":"4e555346-7e76-4f4c-a996-ce05d506d038","Type":"ContainerDied","Data":"3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06"} Nov 29 14:49:59 crc 
kubenswrapper[4907]: I1129 14:49:59.222272 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" event={"ID":"4e555346-7e76-4f4c-a996-ce05d506d038","Type":"ContainerDied","Data":"54432e9d7e183902aaa06a5a7cb3755567458ed5a9245b1e283803a1a8fa4365"} Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.222290 4907 scope.go:117] "RemoveContainer" containerID="3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.222330 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-h9rgl" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.244715 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7w4m2"] Nov 29 14:49:59 crc kubenswrapper[4907]: W1129 14:49:59.265130 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13de3308_18f0_431d_997a_9288da0f520a.slice/crio-057103142cf0ea736a459fb701582bac34c39878ab808c5c6fc1e633ce633536 WatchSource:0}: Error finding container 057103142cf0ea736a459fb701582bac34c39878ab808c5c6fc1e633ce633536: Status 404 returned error can't find the container with id 057103142cf0ea736a459fb701582bac34c39878ab808c5c6fc1e633ce633536 Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.277659 4907 scope.go:117] "RemoveContainer" containerID="1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.283559 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-h9rgl"] Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.293736 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-h9rgl"] Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.303640 4907 scope.go:117] "RemoveContainer" 
containerID="3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06" Nov 29 14:49:59 crc kubenswrapper[4907]: E1129 14:49:59.304036 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06\": container with ID starting with 3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06 not found: ID does not exist" containerID="3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.304081 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06"} err="failed to get container status \"3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06\": rpc error: code = NotFound desc = could not find container \"3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06\": container with ID starting with 3be33b0687af284ab518f192ad5fecb49160e0347fe2bc8fb457320c88ef4a06 not found: ID does not exist" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.304102 4907 scope.go:117] "RemoveContainer" containerID="1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b" Nov 29 14:49:59 crc kubenswrapper[4907]: E1129 14:49:59.304310 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b\": container with ID starting with 1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b not found: ID does not exist" containerID="1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.304365 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b"} err="failed to get container status \"1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b\": rpc error: code = NotFound desc = could not find container \"1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b\": container with ID starting with 1bb93d8aec80761ba9df7c1cd8a86592df51a805cf89937e41f5874cb909f74b not found: ID does not exist" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.357838 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 29 14:49:59 crc kubenswrapper[4907]: E1129 14:49:59.358205 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e555346-7e76-4f4c-a996-ce05d506d038" containerName="dnsmasq-dns" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.358221 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e555346-7e76-4f4c-a996-ce05d506d038" containerName="dnsmasq-dns" Nov 29 14:49:59 crc kubenswrapper[4907]: E1129 14:49:59.358260 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e555346-7e76-4f4c-a996-ce05d506d038" containerName="init" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.358269 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e555346-7e76-4f4c-a996-ce05d506d038" containerName="init" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.358460 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e555346-7e76-4f4c-a996-ce05d506d038" containerName="dnsmasq-dns" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.373949 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.374071 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.376425 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.376666 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-nrf9d" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.377095 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.377891 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.399389 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.399481 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-lock\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.399509 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.399557 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2rwzp\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-kube-api-access-2rwzp\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.399676 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-cache\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.502489 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-cache\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.502586 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.503175 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-cache\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.503352 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-lock\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc 
kubenswrapper[4907]: I1129 14:49:59.503381 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.503424 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rwzp\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-kube-api-access-2rwzp\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.503478 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: E1129 14:49:59.503535 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 14:49:59 crc kubenswrapper[4907]: E1129 14:49:59.503563 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 14:49:59 crc kubenswrapper[4907]: E1129 14:49:59.503617 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift podName:fe027ad6-8a24-44b5-8bfb-732d5c8fe22a nodeName:}" failed. No retries permitted until 2025-11-29 14:50:00.00359534 +0000 UTC m=+1297.990433052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift") pod "swift-storage-0" (UID: "fe027ad6-8a24-44b5-8bfb-732d5c8fe22a") : configmap "swift-ring-files" not found Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.503758 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-lock\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.530155 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rwzp\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-kube-api-access-2rwzp\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:49:59 crc kubenswrapper[4907]: I1129 14:49:59.531152 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.013406 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:50:00 crc kubenswrapper[4907]: E1129 14:50:00.013558 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 14:50:00 crc kubenswrapper[4907]: E1129 14:50:00.013807 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap 
"swift-ring-files" not found Nov 29 14:50:00 crc kubenswrapper[4907]: E1129 14:50:00.013854 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift podName:fe027ad6-8a24-44b5-8bfb-732d5c8fe22a nodeName:}" failed. No retries permitted until 2025-11-29 14:50:01.013837497 +0000 UTC m=+1299.000675149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift") pod "swift-storage-0" (UID: "fe027ad6-8a24-44b5-8bfb-732d5c8fe22a") : configmap "swift-ring-files" not found Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.233693 4907 generic.go:334] "Generic (PLEG): container finished" podID="13de3308-18f0-431d-997a-9288da0f520a" containerID="1876b9c8bc61ee8b5e89e654733f2b34b0e78c02cc8a8d312893b15e282bc3b4" exitCode=0 Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.233813 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7w4m2" event={"ID":"13de3308-18f0-431d-997a-9288da0f520a","Type":"ContainerDied","Data":"1876b9c8bc61ee8b5e89e654733f2b34b0e78c02cc8a8d312893b15e282bc3b4"} Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.233895 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7w4m2" event={"ID":"13de3308-18f0-431d-997a-9288da0f520a","Type":"ContainerStarted","Data":"057103142cf0ea736a459fb701582bac34c39878ab808c5c6fc1e633ce633536"} Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.237684 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc8b80af-94d6-4c43-887b-07aafa877200","Type":"ContainerStarted","Data":"65ab3236b03360b7036fd736c5b01512e2b1b2a3cfd809eb928f708150326cb0"} Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.238652 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" 
Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.281665 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.741259021 podStartE2EDuration="4.281647273s" podCreationTimestamp="2025-11-29 14:49:56 +0000 UTC" firstStartedPulling="2025-11-29 14:49:57.135562728 +0000 UTC m=+1295.122400380" lastFinishedPulling="2025-11-29 14:49:58.67595098 +0000 UTC m=+1296.662788632" observedRunningTime="2025-11-29 14:50:00.278778702 +0000 UTC m=+1298.265616344" watchObservedRunningTime="2025-11-29 14:50:00.281647273 +0000 UTC m=+1298.268484925" Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.317725 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.425030 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.494873 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e555346-7e76-4f4c-a996-ce05d506d038" path="/var/lib/kubelet/pods/4e555346-7e76-4f4c-a996-ce05d506d038/volumes" Nov 29 14:50:00 crc kubenswrapper[4907]: I1129 14:50:00.649605 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:50:01 crc kubenswrapper[4907]: I1129 14:50:01.038983 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:50:01 crc kubenswrapper[4907]: E1129 14:50:01.039323 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 14:50:01 crc kubenswrapper[4907]: E1129 14:50:01.039342 4907 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 14:50:01 crc kubenswrapper[4907]: E1129 14:50:01.039403 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift podName:fe027ad6-8a24-44b5-8bfb-732d5c8fe22a nodeName:}" failed. No retries permitted until 2025-11-29 14:50:03.039386033 +0000 UTC m=+1301.026223695 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift") pod "swift-storage-0" (UID: "fe027ad6-8a24-44b5-8bfb-732d5c8fe22a") : configmap "swift-ring-files" not found Nov 29 14:50:01 crc kubenswrapper[4907]: I1129 14:50:01.252018 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7w4m2" event={"ID":"13de3308-18f0-431d-997a-9288da0f520a","Type":"ContainerStarted","Data":"e957cbcc59c7dceca9e4424e4cb90793f779bc096698fbcc57511e06c8b7ffc1"} Nov 29 14:50:01 crc kubenswrapper[4907]: I1129 14:50:01.252312 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:50:01 crc kubenswrapper[4907]: I1129 14:50:01.273068 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-7w4m2" podStartSLOduration=3.273033277 podStartE2EDuration="3.273033277s" podCreationTimestamp="2025-11-29 14:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:01.268413657 +0000 UTC m=+1299.255251309" watchObservedRunningTime="2025-11-29 14:50:01.273033277 +0000 UTC m=+1299.259870929" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.094526 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:50:03 crc kubenswrapper[4907]: E1129 14:50:03.094749 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 14:50:03 crc kubenswrapper[4907]: E1129 14:50:03.095459 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 14:50:03 crc kubenswrapper[4907]: E1129 14:50:03.095555 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift podName:fe027ad6-8a24-44b5-8bfb-732d5c8fe22a nodeName:}" failed. No retries permitted until 2025-11-29 14:50:07.095518149 +0000 UTC m=+1305.082355801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift") pod "swift-storage-0" (UID: "fe027ad6-8a24-44b5-8bfb-732d5c8fe22a") : configmap "swift-ring-files" not found Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.317377 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-c98bl"] Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.319390 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.334890 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-c98bl"] Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.354086 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.354537 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.355608 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.384033 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nfb2t"] Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.385575 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.402397 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-c98bl"] Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.402490 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-etc-swift\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.402542 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-swiftconf\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.402613 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-ring-data-devices\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.402663 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-combined-ca-bundle\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.402746 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-dispersionconf\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.402786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqb4h\" (UniqueName: \"kubernetes.io/projected/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-kube-api-access-qqb4h\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.402862 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-scripts\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: E1129 14:50:03.403644 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-qqb4h ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-c98bl" podUID="6c5f28a3-3510-406a-81b2-7b7f8e2e2aef" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.416290 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nfb2t"] Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.504560 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-combined-ca-bundle\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " 
pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.504617 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-scripts\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.504646 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v85q\" (UniqueName: \"kubernetes.io/projected/b5bad7a6-9301-4f9e-8303-ae377c4f909f-kube-api-access-4v85q\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.504668 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-ring-data-devices\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.504714 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-dispersionconf\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.504924 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqb4h\" (UniqueName: \"kubernetes.io/projected/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-kube-api-access-qqb4h\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " 
pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.504945 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-combined-ca-bundle\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.504978 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-swiftconf\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.505040 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-scripts\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.505055 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5bad7a6-9301-4f9e-8303-ae377c4f909f-etc-swift\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.505178 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-etc-swift\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 
crc kubenswrapper[4907]: I1129 14:50:03.505219 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-swiftconf\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.505243 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-dispersionconf\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.505284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-ring-data-devices\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.506088 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-ring-data-devices\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.506645 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-etc-swift\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.506754 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-scripts\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.511617 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-swiftconf\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.511886 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-dispersionconf\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.514108 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-combined-ca-bundle\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.529210 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqb4h\" (UniqueName: \"kubernetes.io/projected/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-kube-api-access-qqb4h\") pod \"swift-ring-rebalance-c98bl\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.608845 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-scripts\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.608994 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v85q\" (UniqueName: \"kubernetes.io/projected/b5bad7a6-9301-4f9e-8303-ae377c4f909f-kube-api-access-4v85q\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.609050 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-ring-data-devices\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.609197 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-combined-ca-bundle\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.609256 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-swiftconf\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.609382 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/b5bad7a6-9301-4f9e-8303-ae377c4f909f-etc-swift\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.609462 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-dispersionconf\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.612858 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-ring-data-devices\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.613396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-scripts\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.613763 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5bad7a6-9301-4f9e-8303-ae377c4f909f-etc-swift\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.617709 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-swiftconf\") pod \"swift-ring-rebalance-nfb2t\" (UID: 
\"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.618973 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-combined-ca-bundle\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.624114 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-dispersionconf\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.638216 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v85q\" (UniqueName: \"kubernetes.io/projected/b5bad7a6-9301-4f9e-8303-ae377c4f909f-kube-api-access-4v85q\") pod \"swift-ring-rebalance-nfb2t\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") " pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:03 crc kubenswrapper[4907]: I1129 14:50:03.708968 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nfb2t" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.304531 4907 generic.go:334] "Generic (PLEG): container finished" podID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerID="57e9e8cb7a73ad419533e89d66570ed4fe9e23124e85292737bf8eb73653d7b3" exitCode=0 Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.304803 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"550beb06-c1d9-4568-bb4e-66ff9134cb8e","Type":"ContainerDied","Data":"57e9e8cb7a73ad419533e89d66570ed4fe9e23124e85292737bf8eb73653d7b3"} Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.306906 4907 generic.go:334] "Generic (PLEG): container finished" podID="8aee0179-2960-486d-8129-1e928d55a29f" containerID="b6497214a1f83cc739bcf81c7e151db9861c2777d57157ead4c53dcba818c0a6" exitCode=0 Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.306998 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.306989 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8aee0179-2960-486d-8129-1e928d55a29f","Type":"ContainerDied","Data":"b6497214a1f83cc739bcf81c7e151db9861c2777d57157ead4c53dcba818c0a6"} Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.321798 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c98bl" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.432936 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-etc-swift\") pod \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.432974 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-swiftconf\") pod \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.433121 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-scripts\") pod \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.433208 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-combined-ca-bundle\") pod \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.433317 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-dispersionconf\") pod \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.433356 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-ring-data-devices\") pod \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.433394 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqb4h\" (UniqueName: \"kubernetes.io/projected/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-kube-api-access-qqb4h\") pod \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\" (UID: \"6c5f28a3-3510-406a-81b2-7b7f8e2e2aef\") " Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.434002 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-scripts" (OuterVolumeSpecName: "scripts") pod "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef" (UID: "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.434167 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.434253 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef" (UID: "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.436753 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef" (UID: "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.439898 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef" (UID: "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.444240 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-kube-api-access-qqb4h" (OuterVolumeSpecName: "kube-api-access-qqb4h") pod "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef" (UID: "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef"). InnerVolumeSpecName "kube-api-access-qqb4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.444498 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef" (UID: "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.445741 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef" (UID: "6c5f28a3-3510-406a-81b2-7b7f8e2e2aef"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.531121 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.533599 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.535991 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.536018 4907 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.536033 4907 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.536046 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqb4h\" (UniqueName: \"kubernetes.io/projected/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-kube-api-access-qqb4h\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.536058 4907 
reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.536070 4907 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:04 crc kubenswrapper[4907]: I1129 14:50:04.618541 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.001231 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6b5dbd778c-bhss9" podUID="5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" containerName="console" containerID="cri-o://6e2efe74a40325503da6bcaeba21c72fdef5b0d06629709671596369b6dd6106" gracePeriod=15 Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.237713 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nfb2t"] Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.321185 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b5dbd778c-bhss9_5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1/console/0.log" Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.321227 4907 generic.go:334] "Generic (PLEG): container finished" podID="5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" containerID="6e2efe74a40325503da6bcaeba21c72fdef5b0d06629709671596369b6dd6106" exitCode=2 Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.321282 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5dbd778c-bhss9" event={"ID":"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1","Type":"ContainerDied","Data":"6e2efe74a40325503da6bcaeba21c72fdef5b0d06629709671596369b6dd6106"} Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.323172 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerStarted","Data":"71d9ae3ff5f122b40076021f78adaacc594923e8bbd397b5e8e8a878fe116bf5"}
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.324500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nfb2t" event={"ID":"b5bad7a6-9301-4f9e-8303-ae377c4f909f","Type":"ContainerStarted","Data":"e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599"}
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.330394 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8aee0179-2960-486d-8129-1e928d55a29f","Type":"ContainerStarted","Data":"5e8faf209da0c431e5051a080910785ed675c3ec5f414d6d2f380f167c75b11a"}
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.331449 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.338641 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"550beb06-c1d9-4568-bb4e-66ff9134cb8e","Type":"ContainerStarted","Data":"028361a10b98546053368bb93dba0cba2998d73e54ae4ff700656f76950bbf7b"}
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.339487 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.339577 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c98bl"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.365361 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.118719682 podStartE2EDuration="54.365341372s" podCreationTimestamp="2025-11-29 14:49:11 +0000 UTC" firstStartedPulling="2025-11-29 14:49:13.856313525 +0000 UTC m=+1251.843151177" lastFinishedPulling="2025-11-29 14:49:30.102935205 +0000 UTC m=+1268.089772867" observedRunningTime="2025-11-29 14:50:05.355580043 +0000 UTC m=+1303.342417695" watchObservedRunningTime="2025-11-29 14:50:05.365341372 +0000 UTC m=+1303.352179034"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.387397 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.63543291 podStartE2EDuration="54.387372839s" podCreationTimestamp="2025-11-29 14:49:11 +0000 UTC" firstStartedPulling="2025-11-29 14:49:13.426614987 +0000 UTC m=+1251.413452639" lastFinishedPulling="2025-11-29 14:49:30.178554916 +0000 UTC m=+1268.165392568" observedRunningTime="2025-11-29 14:50:05.381203074 +0000 UTC m=+1303.368040726" watchObservedRunningTime="2025-11-29 14:50:05.387372839 +0000 UTC m=+1303.374210491"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.446482 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-c98bl"]
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.448973 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-c98bl"]
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.475516 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.631206 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b5dbd778c-bhss9_5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1/console/0.log"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.631536 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b5dbd778c-bhss9"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.669981 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-config\") pod \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") "
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.670020 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-trusted-ca-bundle\") pod \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") "
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.670126 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-service-ca\") pod \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") "
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.670205 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-serving-cert\") pod \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") "
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.670233 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndzkf\" (UniqueName: \"kubernetes.io/projected/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-kube-api-access-ndzkf\") pod \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") "
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.670267 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-oauth-config\") pod \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") "
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.670293 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-oauth-serving-cert\") pod \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\" (UID: \"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1\") "
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.671410 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" (UID: "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.671778 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-config" (OuterVolumeSpecName: "console-config") pod "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" (UID: "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.672109 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" (UID: "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.672418 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-service-ca" (OuterVolumeSpecName: "service-ca") pod "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" (UID: "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.682803 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-kube-api-access-ndzkf" (OuterVolumeSpecName: "kube-api-access-ndzkf") pod "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" (UID: "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1"). InnerVolumeSpecName "kube-api-access-ndzkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.682889 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" (UID: "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.684566 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" (UID: "5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.727308 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5da6-account-create-update-7jqbf"]
Nov 29 14:50:05 crc kubenswrapper[4907]: E1129 14:50:05.727866 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" containerName="console"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.727892 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" containerName="console"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.728169 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" containerName="console"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.729138 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5da6-account-create-update-7jqbf"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.734796 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.747681 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5da6-account-create-update-7jqbf"]
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.758811 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-b9xn8"]
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.760207 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b9xn8"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.773161 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ab63941-4052-4105-b09e-2bd04a34ed2d-operator-scripts\") pod \"keystone-5da6-account-create-update-7jqbf\" (UID: \"0ab63941-4052-4105-b09e-2bd04a34ed2d\") " pod="openstack/keystone-5da6-account-create-update-7jqbf"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.773316 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq92j\" (UniqueName: \"kubernetes.io/projected/0ab63941-4052-4105-b09e-2bd04a34ed2d-kube-api-access-vq92j\") pod \"keystone-5da6-account-create-update-7jqbf\" (UID: \"0ab63941-4052-4105-b09e-2bd04a34ed2d\") " pod="openstack/keystone-5da6-account-create-update-7jqbf"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.773449 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-service-ca\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.773468 4907 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.773479 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndzkf\" (UniqueName: \"kubernetes.io/projected/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-kube-api-access-ndzkf\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.773488 4907 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-oauth-config\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.773497 4907 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.773506 4907 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-console-config\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.773514 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.793897 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-b9xn8"]
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.875468 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a97393-4ffa-49d7-a070-aa2758fe10ed-operator-scripts\") pod \"keystone-db-create-b9xn8\" (UID: \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\") " pod="openstack/keystone-db-create-b9xn8"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.876033 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmxl2\" (UniqueName: \"kubernetes.io/projected/f4a97393-4ffa-49d7-a070-aa2758fe10ed-kube-api-access-pmxl2\") pod \"keystone-db-create-b9xn8\" (UID: \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\") " pod="openstack/keystone-db-create-b9xn8"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.876225 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq92j\" (UniqueName: \"kubernetes.io/projected/0ab63941-4052-4105-b09e-2bd04a34ed2d-kube-api-access-vq92j\") pod \"keystone-5da6-account-create-update-7jqbf\" (UID: \"0ab63941-4052-4105-b09e-2bd04a34ed2d\") " pod="openstack/keystone-5da6-account-create-update-7jqbf"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.876414 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ab63941-4052-4105-b09e-2bd04a34ed2d-operator-scripts\") pod \"keystone-5da6-account-create-update-7jqbf\" (UID: \"0ab63941-4052-4105-b09e-2bd04a34ed2d\") " pod="openstack/keystone-5da6-account-create-update-7jqbf"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.877474 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ab63941-4052-4105-b09e-2bd04a34ed2d-operator-scripts\") pod \"keystone-5da6-account-create-update-7jqbf\" (UID: \"0ab63941-4052-4105-b09e-2bd04a34ed2d\") " pod="openstack/keystone-5da6-account-create-update-7jqbf"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.903032 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq92j\" (UniqueName: \"kubernetes.io/projected/0ab63941-4052-4105-b09e-2bd04a34ed2d-kube-api-access-vq92j\") pod \"keystone-5da6-account-create-update-7jqbf\" (UID: \"0ab63941-4052-4105-b09e-2bd04a34ed2d\") " pod="openstack/keystone-5da6-account-create-update-7jqbf"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.961574 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kbxgc"]
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.966698 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kbxgc"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.978238 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a97393-4ffa-49d7-a070-aa2758fe10ed-operator-scripts\") pod \"keystone-db-create-b9xn8\" (UID: \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\") " pod="openstack/keystone-db-create-b9xn8"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.978816 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmxl2\" (UniqueName: \"kubernetes.io/projected/f4a97393-4ffa-49d7-a070-aa2758fe10ed-kube-api-access-pmxl2\") pod \"keystone-db-create-b9xn8\" (UID: \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\") " pod="openstack/keystone-db-create-b9xn8"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.980173 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a97393-4ffa-49d7-a070-aa2758fe10ed-operator-scripts\") pod \"keystone-db-create-b9xn8\" (UID: \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\") " pod="openstack/keystone-db-create-b9xn8"
Nov 29 14:50:05 crc kubenswrapper[4907]: I1129 14:50:05.981849 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kbxgc"]
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.000035 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmxl2\" (UniqueName: \"kubernetes.io/projected/f4a97393-4ffa-49d7-a070-aa2758fe10ed-kube-api-access-pmxl2\") pod \"keystone-db-create-b9xn8\" (UID: \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\") " pod="openstack/keystone-db-create-b9xn8"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.081037 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-operator-scripts\") pod \"placement-db-create-kbxgc\" (UID: \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\") " pod="openstack/placement-db-create-kbxgc"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.081179 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2xnm\" (UniqueName: \"kubernetes.io/projected/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-kube-api-access-k2xnm\") pod \"placement-db-create-kbxgc\" (UID: \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\") " pod="openstack/placement-db-create-kbxgc"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.083018 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b489-account-create-update-ntbwq"]
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.085264 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b489-account-create-update-ntbwq"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.085660 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5da6-account-create-update-7jqbf"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.089899 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.095883 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b9xn8"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.108054 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b489-account-create-update-ntbwq"]
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.182653 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-operator-scripts\") pod \"placement-b489-account-create-update-ntbwq\" (UID: \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\") " pod="openstack/placement-b489-account-create-update-ntbwq"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.182793 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-operator-scripts\") pod \"placement-db-create-kbxgc\" (UID: \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\") " pod="openstack/placement-db-create-kbxgc"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.182896 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xnm\" (UniqueName: \"kubernetes.io/projected/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-kube-api-access-k2xnm\") pod \"placement-db-create-kbxgc\" (UID: \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\") " pod="openstack/placement-db-create-kbxgc"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.183032 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwzbj\" (UniqueName: \"kubernetes.io/projected/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-kube-api-access-cwzbj\") pod \"placement-b489-account-create-update-ntbwq\" (UID: \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\") " pod="openstack/placement-b489-account-create-update-ntbwq"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.184024 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-operator-scripts\") pod \"placement-db-create-kbxgc\" (UID: \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\") " pod="openstack/placement-db-create-kbxgc"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.206491 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2xnm\" (UniqueName: \"kubernetes.io/projected/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-kube-api-access-k2xnm\") pod \"placement-db-create-kbxgc\" (UID: \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\") " pod="openstack/placement-db-create-kbxgc"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.284744 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwzbj\" (UniqueName: \"kubernetes.io/projected/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-kube-api-access-cwzbj\") pod \"placement-b489-account-create-update-ntbwq\" (UID: \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\") " pod="openstack/placement-b489-account-create-update-ntbwq"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.284820 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-operator-scripts\") pod \"placement-b489-account-create-update-ntbwq\" (UID: \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\") " pod="openstack/placement-b489-account-create-update-ntbwq"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.285844 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-operator-scripts\") pod \"placement-b489-account-create-update-ntbwq\" (UID: \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\") " pod="openstack/placement-b489-account-create-update-ntbwq"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.295861 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kbxgc"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.303091 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwzbj\" (UniqueName: \"kubernetes.io/projected/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-kube-api-access-cwzbj\") pod \"placement-b489-account-create-update-ntbwq\" (UID: \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\") " pod="openstack/placement-b489-account-create-update-ntbwq"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.357321 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b5dbd778c-bhss9_5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1/console/0.log"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.357981 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b5dbd778c-bhss9"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.358619 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5dbd778c-bhss9" event={"ID":"5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1","Type":"ContainerDied","Data":"73a5782c2d5e14a6436d7cb098887d7e400d4d19553f9915d6e7d163283245c8"}
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.358687 4907 scope.go:117] "RemoveContainer" containerID="6e2efe74a40325503da6bcaeba21c72fdef5b0d06629709671596369b6dd6106"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.407393 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b5dbd778c-bhss9"]
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.409339 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b489-account-create-update-ntbwq"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.415937 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b5dbd778c-bhss9"]
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.496298 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1" path="/var/lib/kubelet/pods/5a087249-46ad-49cf-8ad9-cd3f2c8dc0c1/volumes"
Nov 29 14:50:06 crc kubenswrapper[4907]: I1129 14:50:06.496890 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5f28a3-3510-406a-81b2-7b7f8e2e2aef" path="/var/lib/kubelet/pods/6c5f28a3-3510-406a-81b2-7b7f8e2e2aef/volumes"
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:06.650082 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5da6-account-create-update-7jqbf"]
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:06.711261 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-b9xn8"]
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:06.938208 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kbxgc"]
Nov 29 14:50:07 crc kubenswrapper[4907]: W1129 14:50:06.957307 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc15eb212_5dab_4f9f_bea0_6f3899a36a8b.slice/crio-f133ab93107cb02d77a41dcbc5fac8ad49334c02b80f69b7f7a551c3fecf3875 WatchSource:0}: Error finding container f133ab93107cb02d77a41dcbc5fac8ad49334c02b80f69b7f7a551c3fecf3875: Status 404 returned error can't find the container with id f133ab93107cb02d77a41dcbc5fac8ad49334c02b80f69b7f7a551c3fecf3875
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:06.960713 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b489-account-create-update-ntbwq"]
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.107421 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0"
Nov 29 14:50:07 crc kubenswrapper[4907]: E1129 14:50:07.107615 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Nov 29 14:50:07 crc kubenswrapper[4907]: E1129 14:50:07.107646 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Nov 29 14:50:07 crc kubenswrapper[4907]: E1129 14:50:07.107707 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift podName:fe027ad6-8a24-44b5-8bfb-732d5c8fe22a nodeName:}" failed. No retries permitted until 2025-11-29 14:50:15.1076886 +0000 UTC m=+1313.094526242 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift") pod "swift-storage-0" (UID: "fe027ad6-8a24-44b5-8bfb-732d5c8fe22a") : configmap "swift-ring-files" not found
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.371139 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kbxgc" event={"ID":"c15eb212-5dab-4f9f-bea0-6f3899a36a8b","Type":"ContainerStarted","Data":"f133ab93107cb02d77a41dcbc5fac8ad49334c02b80f69b7f7a551c3fecf3875"}
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.374150 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b489-account-create-update-ntbwq" event={"ID":"a891cd77-26e1-42f2-bac1-dc68dd51d2d3","Type":"ContainerStarted","Data":"78c0821e8ddfc60c4e95721d16a8d84dced7bafecad8686ac46f9be847883adb"}
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.377616 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b9xn8" event={"ID":"f4a97393-4ffa-49d7-a070-aa2758fe10ed","Type":"ContainerStarted","Data":"3c508cc8b7bb90b5c9f338c6c452b78303a9c108d1b1a328b6142efc7f16d225"}
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.377716 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b9xn8" event={"ID":"f4a97393-4ffa-49d7-a070-aa2758fe10ed","Type":"ContainerStarted","Data":"a53c37cbcb72fd2fca29ea68a311d9d231b96522e1459e261748950a642c066d"}
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.390596 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5da6-account-create-update-7jqbf" event={"ID":"0ab63941-4052-4105-b09e-2bd04a34ed2d","Type":"ContainerStarted","Data":"9129b513c0f177b903b12982ef113951dfd5446c789a2b50a6e7901fb704ada9"}
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.390649 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5da6-account-create-update-7jqbf" event={"ID":"0ab63941-4052-4105-b09e-2bd04a34ed2d","Type":"ContainerStarted","Data":"67f416e989b2329ef641c1d8214943ea32099fa5ae26fa4a18bb25d8a5c70b8e"}
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.405625 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-b9xn8" podStartSLOduration=2.405608561 podStartE2EDuration="2.405608561s" podCreationTimestamp="2025-11-29 14:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:07.395806161 +0000 UTC m=+1305.382643853" watchObservedRunningTime="2025-11-29 14:50:07.405608561 +0000 UTC m=+1305.392446213"
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.438831 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5da6-account-create-update-7jqbf" podStartSLOduration=2.438807757 podStartE2EDuration="2.438807757s" podCreationTimestamp="2025-11-29 14:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:07.422720389 +0000 UTC m=+1305.409558051" watchObservedRunningTime="2025-11-29 14:50:07.438807757 +0000 UTC m=+1305.425645429"
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.919874 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-lfz28"]
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.921199 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-lfz28"
Nov 29 14:50:07 crc kubenswrapper[4907]: I1129 14:50:07.933496 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-lfz28"]
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.033941 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac16ae-ba20-4405-9e83-73dae3db6f5f-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-lfz28\" (UID: \"0dac16ae-ba20-4405-9e83-73dae3db6f5f\") " pod="openstack/mysqld-exporter-openstack-db-create-lfz28"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.034109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w64w8\" (UniqueName: \"kubernetes.io/projected/0dac16ae-ba20-4405-9e83-73dae3db6f5f-kube-api-access-w64w8\") pod \"mysqld-exporter-openstack-db-create-lfz28\" (UID: \"0dac16ae-ba20-4405-9e83-73dae3db6f5f\") " pod="openstack/mysqld-exporter-openstack-db-create-lfz28"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.136458 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w64w8\" (UniqueName: \"kubernetes.io/projected/0dac16ae-ba20-4405-9e83-73dae3db6f5f-kube-api-access-w64w8\") pod \"mysqld-exporter-openstack-db-create-lfz28\" (UID: \"0dac16ae-ba20-4405-9e83-73dae3db6f5f\") " pod="openstack/mysqld-exporter-openstack-db-create-lfz28"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.136570 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac16ae-ba20-4405-9e83-73dae3db6f5f-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-lfz28\" (UID: \"0dac16ae-ba20-4405-9e83-73dae3db6f5f\") " pod="openstack/mysqld-exporter-openstack-db-create-lfz28"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.137182 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac16ae-ba20-4405-9e83-73dae3db6f5f-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-lfz28\" (UID: \"0dac16ae-ba20-4405-9e83-73dae3db6f5f\") " pod="openstack/mysqld-exporter-openstack-db-create-lfz28"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.155076 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w64w8\" (UniqueName: \"kubernetes.io/projected/0dac16ae-ba20-4405-9e83-73dae3db6f5f-kube-api-access-w64w8\") pod \"mysqld-exporter-openstack-db-create-lfz28\" (UID: \"0dac16ae-ba20-4405-9e83-73dae3db6f5f\") " pod="openstack/mysqld-exporter-openstack-db-create-lfz28"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.233217 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-b9b0-account-create-update-gkhzg"]
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.234597 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.238072 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.244590 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-lfz28"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.250516 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-b9b0-account-create-update-gkhzg"]
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.341416 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e46549-a315-4113-a0d2-5aafb96a7f12-operator-scripts\") pod \"mysqld-exporter-b9b0-account-create-update-gkhzg\" (UID: \"18e46549-a315-4113-a0d2-5aafb96a7f12\") " pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.341498 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqrd\" (UniqueName: \"kubernetes.io/projected/18e46549-a315-4113-a0d2-5aafb96a7f12-kube-api-access-zbqrd\") pod \"mysqld-exporter-b9b0-account-create-update-gkhzg\" (UID: \"18e46549-a315-4113-a0d2-5aafb96a7f12\") " pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.410072 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4a97393-4ffa-49d7-a070-aa2758fe10ed" containerID="3c508cc8b7bb90b5c9f338c6c452b78303a9c108d1b1a328b6142efc7f16d225" exitCode=0
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.410124 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b9xn8" event={"ID":"f4a97393-4ffa-49d7-a070-aa2758fe10ed","Type":"ContainerDied","Data":"3c508cc8b7bb90b5c9f338c6c452b78303a9c108d1b1a328b6142efc7f16d225"}
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.411940 4907 generic.go:334] "Generic (PLEG): container finished" podID="0ab63941-4052-4105-b09e-2bd04a34ed2d" containerID="9129b513c0f177b903b12982ef113951dfd5446c789a2b50a6e7901fb704ada9" exitCode=0
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.412016 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5da6-account-create-update-7jqbf" event={"ID":"0ab63941-4052-4105-b09e-2bd04a34ed2d","Type":"ContainerDied","Data":"9129b513c0f177b903b12982ef113951dfd5446c789a2b50a6e7901fb704ada9"}
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.416859 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerStarted","Data":"385b5d09da1d545cd7645ba514e5bb855e37383ef4da8fcf6cf96958c4c84db0"}
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.418805 4907 generic.go:334] "Generic (PLEG): container finished" podID="c15eb212-5dab-4f9f-bea0-6f3899a36a8b" containerID="3c11463cbe3643f16f5afcdeba176dc2fba8065478e6aeb4d300edda040e1205" exitCode=0
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.418848 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kbxgc" event={"ID":"c15eb212-5dab-4f9f-bea0-6f3899a36a8b","Type":"ContainerDied","Data":"3c11463cbe3643f16f5afcdeba176dc2fba8065478e6aeb4d300edda040e1205"}
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.444947 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e46549-a315-4113-a0d2-5aafb96a7f12-operator-scripts\") pod \"mysqld-exporter-b9b0-account-create-update-gkhzg\" (UID: \"18e46549-a315-4113-a0d2-5aafb96a7f12\") " pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg"
Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.445016 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqrd\" (UniqueName: \"kubernetes.io/projected/18e46549-a315-4113-a0d2-5aafb96a7f12-kube-api-access-zbqrd\") pod \"mysqld-exporter-b9b0-account-create-update-gkhzg\" (UID:
\"18e46549-a315-4113-a0d2-5aafb96a7f12\") " pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.446308 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e46549-a315-4113-a0d2-5aafb96a7f12-operator-scripts\") pod \"mysqld-exporter-b9b0-account-create-update-gkhzg\" (UID: \"18e46549-a315-4113-a0d2-5aafb96a7f12\") " pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.463529 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqrd\" (UniqueName: \"kubernetes.io/projected/18e46549-a315-4113-a0d2-5aafb96a7f12-kube-api-access-zbqrd\") pod \"mysqld-exporter-b9b0-account-create-update-gkhzg\" (UID: \"18e46549-a315-4113-a0d2-5aafb96a7f12\") " pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.551745 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.627559 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.688771 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s2mxh"] Nov 29 14:50:08 crc kubenswrapper[4907]: I1129 14:50:08.688989 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" podUID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" containerName="dnsmasq-dns" containerID="cri-o://86a936c5d892dcca038a100252784738ca3de880645f6a9565b680176372c593" gracePeriod=10 Nov 29 14:50:09 crc kubenswrapper[4907]: E1129 14:50:09.005870 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda891cd77_26e1_42f2_bac1_dc68dd51d2d3.slice/crio-conmon-6dd0bd74d1a2a61be5f1b19b14801f8c85942a65cee72f00839051d566dafa6b.scope\": RecentStats: unable to find data in memory cache]" Nov 29 14:50:09 crc kubenswrapper[4907]: I1129 14:50:09.433573 4907 generic.go:334] "Generic (PLEG): container finished" podID="a891cd77-26e1-42f2-bac1-dc68dd51d2d3" containerID="6dd0bd74d1a2a61be5f1b19b14801f8c85942a65cee72f00839051d566dafa6b" exitCode=0 Nov 29 14:50:09 crc kubenswrapper[4907]: I1129 14:50:09.433629 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b489-account-create-update-ntbwq" event={"ID":"a891cd77-26e1-42f2-bac1-dc68dd51d2d3","Type":"ContainerDied","Data":"6dd0bd74d1a2a61be5f1b19b14801f8c85942a65cee72f00839051d566dafa6b"} Nov 29 14:50:09 crc kubenswrapper[4907]: I1129 14:50:09.436608 4907 generic.go:334] "Generic (PLEG): container finished" podID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" 
containerID="86a936c5d892dcca038a100252784738ca3de880645f6a9565b680176372c593" exitCode=0 Nov 29 14:50:09 crc kubenswrapper[4907]: I1129 14:50:09.436765 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" event={"ID":"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a","Type":"ContainerDied","Data":"86a936c5d892dcca038a100252784738ca3de880645f6a9565b680176372c593"} Nov 29 14:50:10 crc kubenswrapper[4907]: I1129 14:50:10.648659 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" podUID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: connect: connection refused" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.305067 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-k9rqw"] Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.308740 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.389565 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-k9rqw"] Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.417238 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faad6940-2c08-4b7a-bdc0-b00def56a777-operator-scripts\") pod \"glance-db-create-k9rqw\" (UID: \"faad6940-2c08-4b7a-bdc0-b00def56a777\") " pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.417314 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8pv\" (UniqueName: \"kubernetes.io/projected/faad6940-2c08-4b7a-bdc0-b00def56a777-kube-api-access-gs8pv\") pod \"glance-db-create-k9rqw\" (UID: \"faad6940-2c08-4b7a-bdc0-b00def56a777\") " 
pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.423860 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6933-account-create-update-jjhlm"] Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.425296 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.426878 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.440005 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6933-account-create-update-jjhlm"] Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.518575 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8pv\" (UniqueName: \"kubernetes.io/projected/faad6940-2c08-4b7a-bdc0-b00def56a777-kube-api-access-gs8pv\") pod \"glance-db-create-k9rqw\" (UID: \"faad6940-2c08-4b7a-bdc0-b00def56a777\") " pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.518633 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmch\" (UniqueName: \"kubernetes.io/projected/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-kube-api-access-9xmch\") pod \"glance-6933-account-create-update-jjhlm\" (UID: \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\") " pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.518660 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-operator-scripts\") pod \"glance-6933-account-create-update-jjhlm\" (UID: \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\") " 
pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.518862 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faad6940-2c08-4b7a-bdc0-b00def56a777-operator-scripts\") pod \"glance-db-create-k9rqw\" (UID: \"faad6940-2c08-4b7a-bdc0-b00def56a777\") " pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.522348 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faad6940-2c08-4b7a-bdc0-b00def56a777-operator-scripts\") pod \"glance-db-create-k9rqw\" (UID: \"faad6940-2c08-4b7a-bdc0-b00def56a777\") " pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.544988 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8pv\" (UniqueName: \"kubernetes.io/projected/faad6940-2c08-4b7a-bdc0-b00def56a777-kube-api-access-gs8pv\") pod \"glance-db-create-k9rqw\" (UID: \"faad6940-2c08-4b7a-bdc0-b00def56a777\") " pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.620919 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xmch\" (UniqueName: \"kubernetes.io/projected/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-kube-api-access-9xmch\") pod \"glance-6933-account-create-update-jjhlm\" (UID: \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\") " pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.620991 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-operator-scripts\") pod \"glance-6933-account-create-update-jjhlm\" (UID: \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\") " 
pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.622024 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-operator-scripts\") pod \"glance-6933-account-create-update-jjhlm\" (UID: \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\") " pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.648971 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmch\" (UniqueName: \"kubernetes.io/projected/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-kube-api-access-9xmch\") pod \"glance-6933-account-create-update-jjhlm\" (UID: \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\") " pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.697775 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:11 crc kubenswrapper[4907]: I1129 14:50:11.741109 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:12 crc kubenswrapper[4907]: I1129 14:50:12.101899 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.370946 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m4mz2" podUID="33f5965b-43ae-484d-9c5c-1a54ae4de6da" containerName="ovn-controller" probeResult="failure" output=< Nov 29 14:50:13 crc kubenswrapper[4907]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 29 14:50:13 crc kubenswrapper[4907]: > Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.486178 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5da6-account-create-update-7jqbf" event={"ID":"0ab63941-4052-4105-b09e-2bd04a34ed2d","Type":"ContainerDied","Data":"67f416e989b2329ef641c1d8214943ea32099fa5ae26fa4a18bb25d8a5c70b8e"} Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.486226 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67f416e989b2329ef641c1d8214943ea32099fa5ae26fa4a18bb25d8a5c70b8e" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.489561 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b489-account-create-update-ntbwq" event={"ID":"a891cd77-26e1-42f2-bac1-dc68dd51d2d3","Type":"ContainerDied","Data":"78c0821e8ddfc60c4e95721d16a8d84dced7bafecad8686ac46f9be847883adb"} Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.490188 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78c0821e8ddfc60c4e95721d16a8d84dced7bafecad8686ac46f9be847883adb" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.747738 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5da6-account-create-update-7jqbf" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.760340 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b489-account-create-update-ntbwq" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.792587 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-operator-scripts\") pod \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\" (UID: \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.792699 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq92j\" (UniqueName: \"kubernetes.io/projected/0ab63941-4052-4105-b09e-2bd04a34ed2d-kube-api-access-vq92j\") pod \"0ab63941-4052-4105-b09e-2bd04a34ed2d\" (UID: \"0ab63941-4052-4105-b09e-2bd04a34ed2d\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.792903 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwzbj\" (UniqueName: \"kubernetes.io/projected/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-kube-api-access-cwzbj\") pod \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\" (UID: \"a891cd77-26e1-42f2-bac1-dc68dd51d2d3\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.792926 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ab63941-4052-4105-b09e-2bd04a34ed2d-operator-scripts\") pod \"0ab63941-4052-4105-b09e-2bd04a34ed2d\" (UID: \"0ab63941-4052-4105-b09e-2bd04a34ed2d\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.793889 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab63941-4052-4105-b09e-2bd04a34ed2d-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "0ab63941-4052-4105-b09e-2bd04a34ed2d" (UID: "0ab63941-4052-4105-b09e-2bd04a34ed2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.793896 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a891cd77-26e1-42f2-bac1-dc68dd51d2d3" (UID: "a891cd77-26e1-42f2-bac1-dc68dd51d2d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.807599 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-kube-api-access-cwzbj" (OuterVolumeSpecName: "kube-api-access-cwzbj") pod "a891cd77-26e1-42f2-bac1-dc68dd51d2d3" (UID: "a891cd77-26e1-42f2-bac1-dc68dd51d2d3"). InnerVolumeSpecName "kube-api-access-cwzbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.808351 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab63941-4052-4105-b09e-2bd04a34ed2d-kube-api-access-vq92j" (OuterVolumeSpecName: "kube-api-access-vq92j") pod "0ab63941-4052-4105-b09e-2bd04a34ed2d" (UID: "0ab63941-4052-4105-b09e-2bd04a34ed2d"). InnerVolumeSpecName "kube-api-access-vq92j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.875300 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b9xn8" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.889506 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kbxgc" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.895087 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.895111 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq92j\" (UniqueName: \"kubernetes.io/projected/0ab63941-4052-4105-b09e-2bd04a34ed2d-kube-api-access-vq92j\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.895120 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwzbj\" (UniqueName: \"kubernetes.io/projected/a891cd77-26e1-42f2-bac1-dc68dd51d2d3-kube-api-access-cwzbj\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.895130 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ab63941-4052-4105-b09e-2bd04a34ed2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.966558 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.996284 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-nb\") pod \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.996623 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2xnm\" (UniqueName: \"kubernetes.io/projected/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-kube-api-access-k2xnm\") pod \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\" (UID: \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.997779 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-dns-svc\") pod \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.997916 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-operator-scripts\") pod \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\" (UID: \"c15eb212-5dab-4f9f-bea0-6f3899a36a8b\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.998062 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plrh5\" (UniqueName: \"kubernetes.io/projected/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-kube-api-access-plrh5\") pod \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.998176 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-sb\") pod \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.999307 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c15eb212-5dab-4f9f-bea0-6f3899a36a8b" (UID: "c15eb212-5dab-4f9f-bea0-6f3899a36a8b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.999754 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmxl2\" (UniqueName: \"kubernetes.io/projected/f4a97393-4ffa-49d7-a070-aa2758fe10ed-kube-api-access-pmxl2\") pod \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\" (UID: \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\") " Nov 29 14:50:13 crc kubenswrapper[4907]: I1129 14:50:13.999873 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-config\") pod \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\" (UID: \"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a\") " Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:13.999989 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a97393-4ffa-49d7-a070-aa2758fe10ed-operator-scripts\") pod \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\" (UID: \"f4a97393-4ffa-49d7-a070-aa2758fe10ed\") " Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.000972 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.001616 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a97393-4ffa-49d7-a070-aa2758fe10ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4a97393-4ffa-49d7-a070-aa2758fe10ed" (UID: "f4a97393-4ffa-49d7-a070-aa2758fe10ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.006847 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-kube-api-access-plrh5" (OuterVolumeSpecName: "kube-api-access-plrh5") pod "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" (UID: "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a"). InnerVolumeSpecName "kube-api-access-plrh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.008517 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a97393-4ffa-49d7-a070-aa2758fe10ed-kube-api-access-pmxl2" (OuterVolumeSpecName: "kube-api-access-pmxl2") pod "f4a97393-4ffa-49d7-a070-aa2758fe10ed" (UID: "f4a97393-4ffa-49d7-a070-aa2758fe10ed"). InnerVolumeSpecName "kube-api-access-pmxl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.011386 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-kube-api-access-k2xnm" (OuterVolumeSpecName: "kube-api-access-k2xnm") pod "c15eb212-5dab-4f9f-bea0-6f3899a36a8b" (UID: "c15eb212-5dab-4f9f-bea0-6f3899a36a8b"). InnerVolumeSpecName "kube-api-access-k2xnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.061258 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" (UID: "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.069406 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" (UID: "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.070611 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-config" (OuterVolumeSpecName: "config") pod "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" (UID: "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.074161 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" (UID: "e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.103617 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmxl2\" (UniqueName: \"kubernetes.io/projected/f4a97393-4ffa-49d7-a070-aa2758fe10ed-kube-api-access-pmxl2\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.103669 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.103683 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a97393-4ffa-49d7-a070-aa2758fe10ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.103695 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.103705 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2xnm\" (UniqueName: \"kubernetes.io/projected/c15eb212-5dab-4f9f-bea0-6f3899a36a8b-kube-api-access-k2xnm\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.103713 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.103724 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plrh5\" (UniqueName: \"kubernetes.io/projected/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-kube-api-access-plrh5\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 
14:50:14.103732 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.237933 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-b9b0-account-create-update-gkhzg"] Nov 29 14:50:14 crc kubenswrapper[4907]: W1129 14:50:14.246152 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5040bcc9_5a8d_4a9e_bdc6_b283a9ce828f.slice/crio-ab00e4af247360ea83772c5b07dd9e38129ad619edade7c373cdd9575442df98 WatchSource:0}: Error finding container ab00e4af247360ea83772c5b07dd9e38129ad619edade7c373cdd9575442df98: Status 404 returned error can't find the container with id ab00e4af247360ea83772c5b07dd9e38129ad619edade7c373cdd9575442df98 Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.248658 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6933-account-create-update-jjhlm"] Nov 29 14:50:14 crc kubenswrapper[4907]: W1129 14:50:14.250406 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e46549_a315_4113_a0d2_5aafb96a7f12.slice/crio-59137f781118f78243145b0219c6c51f9d95d1c28a817ac06e21c75ec3fbea90 WatchSource:0}: Error finding container 59137f781118f78243145b0219c6c51f9d95d1c28a817ac06e21c75ec3fbea90: Status 404 returned error can't find the container with id 59137f781118f78243145b0219c6c51f9d95d1c28a817ac06e21c75ec3fbea90 Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.443673 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-k9rqw"] Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.456101 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-lfz28"] 
Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.520310 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" event={"ID":"18e46549-a315-4113-a0d2-5aafb96a7f12","Type":"ContainerStarted","Data":"3077e6a58d6e911bfd1d1aa471d710e78c3a4991eea2bfdfe50a5594c9ef9c64"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.520360 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" event={"ID":"18e46549-a315-4113-a0d2-5aafb96a7f12","Type":"ContainerStarted","Data":"59137f781118f78243145b0219c6c51f9d95d1c28a817ac06e21c75ec3fbea90"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.532610 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nfb2t" event={"ID":"b5bad7a6-9301-4f9e-8303-ae377c4f909f","Type":"ContainerStarted","Data":"8fe826a040a408653166e61cb661c2a1ba5e46eed716e8860a7ad73cb329cb2e"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.541072 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" event={"ID":"e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a","Type":"ContainerDied","Data":"4b90530062f4362c6ac6c029361d5c91c740fdbb89b95b63a090ac891c29f1c5"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.541135 4907 scope.go:117] "RemoveContainer" containerID="86a936c5d892dcca038a100252784738ca3de880645f6a9565b680176372c593" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.541326 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s2mxh" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.564074 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-k9rqw" event={"ID":"faad6940-2c08-4b7a-bdc0-b00def56a777","Type":"ContainerStarted","Data":"249cd4217e0af99e163589025be0d16c9152dec2de23a939ab7c4577683c4690"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.564197 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" podStartSLOduration=6.564174947 podStartE2EDuration="6.564174947s" podCreationTimestamp="2025-11-29 14:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:14.534843841 +0000 UTC m=+1312.521681493" watchObservedRunningTime="2025-11-29 14:50:14.564174947 +0000 UTC m=+1312.551012599" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.566624 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-lfz28" event={"ID":"0dac16ae-ba20-4405-9e83-73dae3db6f5f","Type":"ContainerStarted","Data":"810ca9fce962fe07337d2b19ef94270f9ea53ad412e3f59417cab891e25bcdf4"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.569159 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kbxgc" event={"ID":"c15eb212-5dab-4f9f-bea0-6f3899a36a8b","Type":"ContainerDied","Data":"f133ab93107cb02d77a41dcbc5fac8ad49334c02b80f69b7f7a551c3fecf3875"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.569196 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f133ab93107cb02d77a41dcbc5fac8ad49334c02b80f69b7f7a551c3fecf3875" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.569274 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kbxgc" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.569660 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-nfb2t" podStartSLOduration=3.199791134 podStartE2EDuration="11.569650223s" podCreationTimestamp="2025-11-29 14:50:03 +0000 UTC" firstStartedPulling="2025-11-29 14:50:05.248209013 +0000 UTC m=+1303.235046675" lastFinishedPulling="2025-11-29 14:50:13.618068092 +0000 UTC m=+1311.604905764" observedRunningTime="2025-11-29 14:50:14.561098039 +0000 UTC m=+1312.547935691" watchObservedRunningTime="2025-11-29 14:50:14.569650223 +0000 UTC m=+1312.556487875" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.575657 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6933-account-create-update-jjhlm" event={"ID":"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f","Type":"ContainerStarted","Data":"0013934d8d00b3c8ce7cfbb8b75c8c7f675d4f3dbef8ed7c8fdc96c66c4743ff"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.575692 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6933-account-create-update-jjhlm" event={"ID":"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f","Type":"ContainerStarted","Data":"ab00e4af247360ea83772c5b07dd9e38129ad619edade7c373cdd9575442df98"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.586615 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5da6-account-create-update-7jqbf" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.586660 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b9xn8" event={"ID":"f4a97393-4ffa-49d7-a070-aa2758fe10ed","Type":"ContainerDied","Data":"a53c37cbcb72fd2fca29ea68a311d9d231b96522e1459e261748950a642c066d"} Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.586699 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a53c37cbcb72fd2fca29ea68a311d9d231b96522e1459e261748950a642c066d" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.586756 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b489-account-create-update-ntbwq" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.586622 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b9xn8" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.595574 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-6933-account-create-update-jjhlm" podStartSLOduration=3.595559592 podStartE2EDuration="3.595559592s" podCreationTimestamp="2025-11-29 14:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:14.5947999 +0000 UTC m=+1312.581637552" watchObservedRunningTime="2025-11-29 14:50:14.595559592 +0000 UTC m=+1312.582397244" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.654609 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s2mxh"] Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.661079 4907 scope.go:117] "RemoveContainer" containerID="50efbf000a2ac92787a8ab43e6298d71203d3ba3b5a4f188572ff55c0298dd18" Nov 29 14:50:14 crc kubenswrapper[4907]: I1129 14:50:14.662493 4907 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s2mxh"] Nov 29 14:50:15 crc kubenswrapper[4907]: I1129 14:50:15.138718 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:50:15 crc kubenswrapper[4907]: E1129 14:50:15.138940 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 14:50:15 crc kubenswrapper[4907]: E1129 14:50:15.139185 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 14:50:15 crc kubenswrapper[4907]: E1129 14:50:15.139248 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift podName:fe027ad6-8a24-44b5-8bfb-732d5c8fe22a nodeName:}" failed. No retries permitted until 2025-11-29 14:50:31.139230487 +0000 UTC m=+1329.126068139 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift") pod "swift-storage-0" (UID: "fe027ad6-8a24-44b5-8bfb-732d5c8fe22a") : configmap "swift-ring-files" not found Nov 29 14:50:15 crc kubenswrapper[4907]: I1129 14:50:15.600712 4907 generic.go:334] "Generic (PLEG): container finished" podID="18e46549-a315-4113-a0d2-5aafb96a7f12" containerID="3077e6a58d6e911bfd1d1aa471d710e78c3a4991eea2bfdfe50a5594c9ef9c64" exitCode=0 Nov 29 14:50:15 crc kubenswrapper[4907]: I1129 14:50:15.600794 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" event={"ID":"18e46549-a315-4113-a0d2-5aafb96a7f12","Type":"ContainerDied","Data":"3077e6a58d6e911bfd1d1aa471d710e78c3a4991eea2bfdfe50a5594c9ef9c64"} Nov 29 14:50:15 crc kubenswrapper[4907]: I1129 14:50:15.613835 4907 generic.go:334] "Generic (PLEG): container finished" podID="faad6940-2c08-4b7a-bdc0-b00def56a777" containerID="47e2b0cbe89453e8ced43cb2bf7f7f8c8cd98fb6d4430f91cc2daf4aae076804" exitCode=0 Nov 29 14:50:15 crc kubenswrapper[4907]: I1129 14:50:15.613925 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-k9rqw" event={"ID":"faad6940-2c08-4b7a-bdc0-b00def56a777","Type":"ContainerDied","Data":"47e2b0cbe89453e8ced43cb2bf7f7f8c8cd98fb6d4430f91cc2daf4aae076804"} Nov 29 14:50:15 crc kubenswrapper[4907]: I1129 14:50:15.617377 4907 generic.go:334] "Generic (PLEG): container finished" podID="0dac16ae-ba20-4405-9e83-73dae3db6f5f" containerID="4e712654f5ddcdfe5720ed2630869def36bb4b2548bd443cc1dd7d00126b9120" exitCode=0 Nov 29 14:50:15 crc kubenswrapper[4907]: I1129 14:50:15.617494 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-lfz28" event={"ID":"0dac16ae-ba20-4405-9e83-73dae3db6f5f","Type":"ContainerDied","Data":"4e712654f5ddcdfe5720ed2630869def36bb4b2548bd443cc1dd7d00126b9120"} Nov 29 14:50:15 
crc kubenswrapper[4907]: I1129 14:50:15.623594 4907 generic.go:334] "Generic (PLEG): container finished" podID="5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f" containerID="0013934d8d00b3c8ce7cfbb8b75c8c7f675d4f3dbef8ed7c8fdc96c66c4743ff" exitCode=0 Nov 29 14:50:15 crc kubenswrapper[4907]: I1129 14:50:15.623901 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6933-account-create-update-jjhlm" event={"ID":"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f","Type":"ContainerDied","Data":"0013934d8d00b3c8ce7cfbb8b75c8c7f675d4f3dbef8ed7c8fdc96c66c4743ff"} Nov 29 14:50:16 crc kubenswrapper[4907]: I1129 14:50:16.492948 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" path="/var/lib/kubelet/pods/e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a/volumes" Nov 29 14:50:16 crc kubenswrapper[4907]: I1129 14:50:16.636521 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerStarted","Data":"38ab97381af488a71c6823c824d50f461ea8b6a354c14d29bccee03666031681"} Nov 29 14:50:16 crc kubenswrapper[4907]: I1129 14:50:16.683459 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.281117713 podStartE2EDuration="58.683422887s" podCreationTimestamp="2025-11-29 14:49:18 +0000 UTC" firstStartedPulling="2025-11-29 14:49:31.554908407 +0000 UTC m=+1269.541746059" lastFinishedPulling="2025-11-29 14:50:15.957213581 +0000 UTC m=+1313.944051233" observedRunningTime="2025-11-29 14:50:16.671318512 +0000 UTC m=+1314.658156204" watchObservedRunningTime="2025-11-29 14:50:16.683422887 +0000 UTC m=+1314.670260539" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.141112 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.184143 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-operator-scripts\") pod \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\" (UID: \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\") " Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.184293 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xmch\" (UniqueName: \"kubernetes.io/projected/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-kube-api-access-9xmch\") pod \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\" (UID: \"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f\") " Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.184788 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f" (UID: "5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.185210 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.214005 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-kube-api-access-9xmch" (OuterVolumeSpecName: "kube-api-access-9xmch") pod "5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f" (UID: "5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f"). InnerVolumeSpecName "kube-api-access-9xmch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.287720 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xmch\" (UniqueName: \"kubernetes.io/projected/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f-kube-api-access-9xmch\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.350267 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.356199 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-lfz28" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.367080 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.390527 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w64w8\" (UniqueName: \"kubernetes.io/projected/0dac16ae-ba20-4405-9e83-73dae3db6f5f-kube-api-access-w64w8\") pod \"0dac16ae-ba20-4405-9e83-73dae3db6f5f\" (UID: \"0dac16ae-ba20-4405-9e83-73dae3db6f5f\") " Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.390638 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faad6940-2c08-4b7a-bdc0-b00def56a777-operator-scripts\") pod \"faad6940-2c08-4b7a-bdc0-b00def56a777\" (UID: \"faad6940-2c08-4b7a-bdc0-b00def56a777\") " Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.390778 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac16ae-ba20-4405-9e83-73dae3db6f5f-operator-scripts\") pod \"0dac16ae-ba20-4405-9e83-73dae3db6f5f\" (UID: 
\"0dac16ae-ba20-4405-9e83-73dae3db6f5f\") " Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.390910 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs8pv\" (UniqueName: \"kubernetes.io/projected/faad6940-2c08-4b7a-bdc0-b00def56a777-kube-api-access-gs8pv\") pod \"faad6940-2c08-4b7a-bdc0-b00def56a777\" (UID: \"faad6940-2c08-4b7a-bdc0-b00def56a777\") " Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.394472 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faad6940-2c08-4b7a-bdc0-b00def56a777-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "faad6940-2c08-4b7a-bdc0-b00def56a777" (UID: "faad6940-2c08-4b7a-bdc0-b00def56a777"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.396088 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dac16ae-ba20-4405-9e83-73dae3db6f5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0dac16ae-ba20-4405-9e83-73dae3db6f5f" (UID: "0dac16ae-ba20-4405-9e83-73dae3db6f5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.435594 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faad6940-2c08-4b7a-bdc0-b00def56a777-kube-api-access-gs8pv" (OuterVolumeSpecName: "kube-api-access-gs8pv") pod "faad6940-2c08-4b7a-bdc0-b00def56a777" (UID: "faad6940-2c08-4b7a-bdc0-b00def56a777"). InnerVolumeSpecName "kube-api-access-gs8pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.439660 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dac16ae-ba20-4405-9e83-73dae3db6f5f-kube-api-access-w64w8" (OuterVolumeSpecName: "kube-api-access-w64w8") pod "0dac16ae-ba20-4405-9e83-73dae3db6f5f" (UID: "0dac16ae-ba20-4405-9e83-73dae3db6f5f"). InnerVolumeSpecName "kube-api-access-w64w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.493603 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e46549-a315-4113-a0d2-5aafb96a7f12-operator-scripts\") pod \"18e46549-a315-4113-a0d2-5aafb96a7f12\" (UID: \"18e46549-a315-4113-a0d2-5aafb96a7f12\") " Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.494047 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e46549-a315-4113-a0d2-5aafb96a7f12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18e46549-a315-4113-a0d2-5aafb96a7f12" (UID: "18e46549-a315-4113-a0d2-5aafb96a7f12"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.494065 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbqrd\" (UniqueName: \"kubernetes.io/projected/18e46549-a315-4113-a0d2-5aafb96a7f12-kube-api-access-zbqrd\") pod \"18e46549-a315-4113-a0d2-5aafb96a7f12\" (UID: \"18e46549-a315-4113-a0d2-5aafb96a7f12\") " Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.494954 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e46549-a315-4113-a0d2-5aafb96a7f12-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.495140 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac16ae-ba20-4405-9e83-73dae3db6f5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.495197 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs8pv\" (UniqueName: \"kubernetes.io/projected/faad6940-2c08-4b7a-bdc0-b00def56a777-kube-api-access-gs8pv\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.495247 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w64w8\" (UniqueName: \"kubernetes.io/projected/0dac16ae-ba20-4405-9e83-73dae3db6f5f-kube-api-access-w64w8\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.495294 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faad6940-2c08-4b7a-bdc0-b00def56a777-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.498817 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/18e46549-a315-4113-a0d2-5aafb96a7f12-kube-api-access-zbqrd" (OuterVolumeSpecName: "kube-api-access-zbqrd") pod "18e46549-a315-4113-a0d2-5aafb96a7f12" (UID: "18e46549-a315-4113-a0d2-5aafb96a7f12"). InnerVolumeSpecName "kube-api-access-zbqrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.597603 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbqrd\" (UniqueName: \"kubernetes.io/projected/18e46549-a315-4113-a0d2-5aafb96a7f12-kube-api-access-zbqrd\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.651564 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-k9rqw" event={"ID":"faad6940-2c08-4b7a-bdc0-b00def56a777","Type":"ContainerDied","Data":"249cd4217e0af99e163589025be0d16c9152dec2de23a939ab7c4577683c4690"} Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.652845 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="249cd4217e0af99e163589025be0d16c9152dec2de23a939ab7c4577683c4690" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.651648 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-k9rqw" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.654279 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-lfz28" event={"ID":"0dac16ae-ba20-4405-9e83-73dae3db6f5f","Type":"ContainerDied","Data":"810ca9fce962fe07337d2b19ef94270f9ea53ad412e3f59417cab891e25bcdf4"} Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.654355 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="810ca9fce962fe07337d2b19ef94270f9ea53ad412e3f59417cab891e25bcdf4" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.654314 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-lfz28" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.657156 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6933-account-create-update-jjhlm" event={"ID":"5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f","Type":"ContainerDied","Data":"ab00e4af247360ea83772c5b07dd9e38129ad619edade7c373cdd9575442df98"} Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.657217 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab00e4af247360ea83772c5b07dd9e38129ad619edade7c373cdd9575442df98" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.657169 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6933-account-create-update-jjhlm" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.659140 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.659138 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b9b0-account-create-update-gkhzg" event={"ID":"18e46549-a315-4113-a0d2-5aafb96a7f12","Type":"ContainerDied","Data":"59137f781118f78243145b0219c6c51f9d95d1c28a817ac06e21c75ec3fbea90"} Nov 29 14:50:17 crc kubenswrapper[4907]: I1129 14:50:17.659185 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59137f781118f78243145b0219c6c51f9d95d1c28a817ac06e21c75ec3fbea90" Nov 29 14:50:18 crc kubenswrapper[4907]: I1129 14:50:18.406398 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m4mz2" podUID="33f5965b-43ae-484d-9c5c-1a54ae4de6da" containerName="ovn-controller" probeResult="failure" output=< Nov 29 14:50:18 crc kubenswrapper[4907]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 29 14:50:18 
crc kubenswrapper[4907]: > Nov 29 14:50:19 crc kubenswrapper[4907]: I1129 14:50:19.455235 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:19 crc kubenswrapper[4907]: I1129 14:50:19.456162 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:19 crc kubenswrapper[4907]: I1129 14:50:19.459219 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:19 crc kubenswrapper[4907]: I1129 14:50:19.681260 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:20 crc kubenswrapper[4907]: I1129 14:50:20.689681 4907 generic.go:334] "Generic (PLEG): container finished" podID="b5bad7a6-9301-4f9e-8303-ae377c4f909f" containerID="8fe826a040a408653166e61cb661c2a1ba5e46eed716e8860a7ad73cb329cb2e" exitCode=0 Nov 29 14:50:20 crc kubenswrapper[4907]: I1129 14:50:20.689770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nfb2t" event={"ID":"b5bad7a6-9301-4f9e-8303-ae377c4f909f","Type":"ContainerDied","Data":"8fe826a040a408653166e61cb661c2a1ba5e46eed716e8860a7ad73cb329cb2e"} Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.624576 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pwxst"] Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.624928 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a97393-4ffa-49d7-a070-aa2758fe10ed" containerName="mariadb-database-create" Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.624944 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a97393-4ffa-49d7-a070-aa2758fe10ed" containerName="mariadb-database-create" Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.624954 4907 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="18e46549-a315-4113-a0d2-5aafb96a7f12" containerName="mariadb-account-create-update" Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.624959 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e46549-a315-4113-a0d2-5aafb96a7f12" containerName="mariadb-account-create-update" Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.624971 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab63941-4052-4105-b09e-2bd04a34ed2d" containerName="mariadb-account-create-update" Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.624977 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab63941-4052-4105-b09e-2bd04a34ed2d" containerName="mariadb-account-create-update" Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.624990 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" containerName="dnsmasq-dns" Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.624996 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" containerName="dnsmasq-dns" Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.625005 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" containerName="init" Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625012 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" containerName="init" Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.625021 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a891cd77-26e1-42f2-bac1-dc68dd51d2d3" containerName="mariadb-account-create-update" Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625027 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a891cd77-26e1-42f2-bac1-dc68dd51d2d3" containerName="mariadb-account-create-update" Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.625039 4907 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dac16ae-ba20-4405-9e83-73dae3db6f5f" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625045 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dac16ae-ba20-4405-9e83-73dae3db6f5f" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.625058 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faad6940-2c08-4b7a-bdc0-b00def56a777" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625063 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="faad6940-2c08-4b7a-bdc0-b00def56a777" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.625075 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f" containerName="mariadb-account-create-update"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625081 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f" containerName="mariadb-account-create-update"
Nov 29 14:50:21 crc kubenswrapper[4907]: E1129 14:50:21.625091 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c15eb212-5dab-4f9f-bea0-6f3899a36a8b" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625098 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15eb212-5dab-4f9f-bea0-6f3899a36a8b" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625270 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a891cd77-26e1-42f2-bac1-dc68dd51d2d3" containerName="mariadb-account-create-update"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625281 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a97393-4ffa-49d7-a070-aa2758fe10ed" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625298 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e46549-a315-4113-a0d2-5aafb96a7f12" containerName="mariadb-account-create-update"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625312 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab63941-4052-4105-b09e-2bd04a34ed2d" containerName="mariadb-account-create-update"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625321 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c5f8be-d271-4c70-8fe3-d52f2ac49d8a" containerName="dnsmasq-dns"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625329 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c15eb212-5dab-4f9f-bea0-6f3899a36a8b" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625336 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="faad6940-2c08-4b7a-bdc0-b00def56a777" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625347 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f" containerName="mariadb-account-create-update"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625353 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dac16ae-ba20-4405-9e83-73dae3db6f5f" containerName="mariadb-database-create"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.625969 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.636464 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4qqtq"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.636720 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.643883 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pwxst"]
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.687619 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rntmt\" (UniqueName: \"kubernetes.io/projected/8d940ef0-0877-471d-906a-b6235392867d-kube-api-access-rntmt\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.687691 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-db-sync-config-data\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.688046 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-combined-ca-bundle\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.688246 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-config-data\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.789661 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-config-data\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.789969 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rntmt\" (UniqueName: \"kubernetes.io/projected/8d940ef0-0877-471d-906a-b6235392867d-kube-api-access-rntmt\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.789998 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-db-sync-config-data\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.790071 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-combined-ca-bundle\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.795034 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-db-sync-config-data\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.795396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-combined-ca-bundle\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.795496 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-config-data\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.818894 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rntmt\" (UniqueName: \"kubernetes.io/projected/8d940ef0-0877-471d-906a-b6235392867d-kube-api-access-rntmt\") pod \"glance-db-sync-pwxst\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.926364 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Nov 29 14:50:21 crc kubenswrapper[4907]: I1129 14:50:21.942840 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pwxst"
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.111509 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nfb2t"
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.199455 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v85q\" (UniqueName: \"kubernetes.io/projected/b5bad7a6-9301-4f9e-8303-ae377c4f909f-kube-api-access-4v85q\") pod \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") "
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.199523 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-dispersionconf\") pod \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") "
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.199768 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-swiftconf\") pod \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") "
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.199789 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5bad7a6-9301-4f9e-8303-ae377c4f909f-etc-swift\") pod \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") "
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.199832 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-scripts\") pod \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") "
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.199925 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-ring-data-devices\") pod \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") "
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.200010 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-combined-ca-bundle\") pod \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\" (UID: \"b5bad7a6-9301-4f9e-8303-ae377c4f909f\") "
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.200561 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b5bad7a6-9301-4f9e-8303-ae377c4f909f" (UID: "b5bad7a6-9301-4f9e-8303-ae377c4f909f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.200673 4907 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-ring-data-devices\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.201246 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5bad7a6-9301-4f9e-8303-ae377c4f909f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b5bad7a6-9301-4f9e-8303-ae377c4f909f" (UID: "b5bad7a6-9301-4f9e-8303-ae377c4f909f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.205467 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5bad7a6-9301-4f9e-8303-ae377c4f909f-kube-api-access-4v85q" (OuterVolumeSpecName: "kube-api-access-4v85q") pod "b5bad7a6-9301-4f9e-8303-ae377c4f909f" (UID: "b5bad7a6-9301-4f9e-8303-ae377c4f909f"). InnerVolumeSpecName "kube-api-access-4v85q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.209150 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b5bad7a6-9301-4f9e-8303-ae377c4f909f" (UID: "b5bad7a6-9301-4f9e-8303-ae377c4f909f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.235130 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b5bad7a6-9301-4f9e-8303-ae377c4f909f" (UID: "b5bad7a6-9301-4f9e-8303-ae377c4f909f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.239763 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5bad7a6-9301-4f9e-8303-ae377c4f909f" (UID: "b5bad7a6-9301-4f9e-8303-ae377c4f909f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.250098 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-scripts" (OuterVolumeSpecName: "scripts") pod "b5bad7a6-9301-4f9e-8303-ae377c4f909f" (UID: "b5bad7a6-9301-4f9e-8303-ae377c4f909f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.303600 4907 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-swiftconf\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.303631 4907 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5bad7a6-9301-4f9e-8303-ae377c4f909f-etc-swift\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.303640 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5bad7a6-9301-4f9e-8303-ae377c4f909f-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.303649 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.303660 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v85q\" (UniqueName: \"kubernetes.io/projected/b5bad7a6-9301-4f9e-8303-ae377c4f909f-kube-api-access-4v85q\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.303669 4907 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5bad7a6-9301-4f9e-8303-ae377c4f909f-dispersionconf\") on node \"crc\" DevicePath \"\""
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.498429 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pwxst"]
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.706699 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pwxst" event={"ID":"8d940ef0-0877-471d-906a-b6235392867d","Type":"ContainerStarted","Data":"6847b8265f02d597825b921ad787455802820d570bb7f68a57727149a8050b35"}
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.708067 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nfb2t"
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.708126 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nfb2t" event={"ID":"b5bad7a6-9301-4f9e-8303-ae377c4f909f","Type":"ContainerDied","Data":"e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599"}
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.708156 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599"
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.708285 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="prometheus" containerID="cri-o://71d9ae3ff5f122b40076021f78adaacc594923e8bbd397b5e8e8a878fe116bf5" gracePeriod=600
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.709537 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="thanos-sidecar" containerID="cri-o://38ab97381af488a71c6823c824d50f461ea8b6a354c14d29bccee03666031681" gracePeriod=600
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.709728 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="config-reloader" containerID="cri-o://385b5d09da1d545cd7645ba514e5bb855e37383ef4da8fcf6cf96958c4c84db0" gracePeriod=600
Nov 29 14:50:22 crc kubenswrapper[4907]: I1129 14:50:22.840223 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.207555 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-nlrlx"]
Nov 29 14:50:23 crc kubenswrapper[4907]: E1129 14:50:23.208109 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bad7a6-9301-4f9e-8303-ae377c4f909f" containerName="swift-ring-rebalance"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.208132 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bad7a6-9301-4f9e-8303-ae377c4f909f" containerName="swift-ring-rebalance"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.208379 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5bad7a6-9301-4f9e-8303-ae377c4f909f" containerName="swift-ring-rebalance"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.209310 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nlrlx"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.216262 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nlrlx"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.233568 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-operator-scripts\") pod \"heat-db-create-nlrlx\" (UID: \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\") " pod="openstack/heat-db-create-nlrlx"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.233624 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwkf2\" (UniqueName: \"kubernetes.io/projected/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-kube-api-access-mwkf2\") pod \"heat-db-create-nlrlx\" (UID: \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\") " pod="openstack/heat-db-create-nlrlx"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.311726 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.327162 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-4ea7-account-create-update-ft8pf"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.328648 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4ea7-account-create-update-ft8pf"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.333703 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.335267 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-operator-scripts\") pod \"heat-db-create-nlrlx\" (UID: \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\") " pod="openstack/heat-db-create-nlrlx"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.335339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwkf2\" (UniqueName: \"kubernetes.io/projected/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-kube-api-access-mwkf2\") pod \"heat-db-create-nlrlx\" (UID: \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\") " pod="openstack/heat-db-create-nlrlx"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.336832 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-operator-scripts\") pod \"heat-db-create-nlrlx\" (UID: \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\") " pod="openstack/heat-db-create-nlrlx"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.345508 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4ea7-account-create-update-ft8pf"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.356829 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-w9lsr"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.371246 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w9lsr"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.379751 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-w9lsr"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.411403 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4mwp5"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.412716 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4mwp5"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.421244 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwkf2\" (UniqueName: \"kubernetes.io/projected/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-kube-api-access-mwkf2\") pod \"heat-db-create-nlrlx\" (UID: \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\") " pod="openstack/heat-db-create-nlrlx"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.440047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmrjq\" (UniqueName: \"kubernetes.io/projected/70d12fd2-0c0c-435c-863d-0b5445b67460-kube-api-access-bmrjq\") pod \"heat-4ea7-account-create-update-ft8pf\" (UID: \"70d12fd2-0c0c-435c-863d-0b5445b67460\") " pod="openstack/heat-4ea7-account-create-update-ft8pf"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.449777 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4l9s\" (UniqueName: \"kubernetes.io/projected/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-kube-api-access-x4l9s\") pod \"cinder-db-create-w9lsr\" (UID: \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\") " pod="openstack/cinder-db-create-w9lsr"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.448060 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4mwp5"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.449981 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rht4q\" (UniqueName: \"kubernetes.io/projected/4d0cf369-d1c2-495e-82d9-31d8f75b3538-kube-api-access-rht4q\") pod \"barbican-db-create-4mwp5\" (UID: \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\") " pod="openstack/barbican-db-create-4mwp5"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.450373 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d0cf369-d1c2-495e-82d9-31d8f75b3538-operator-scripts\") pod \"barbican-db-create-4mwp5\" (UID: \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\") " pod="openstack/barbican-db-create-4mwp5"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.450474 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70d12fd2-0c0c-435c-863d-0b5445b67460-operator-scripts\") pod \"heat-4ea7-account-create-update-ft8pf\" (UID: \"70d12fd2-0c0c-435c-863d-0b5445b67460\") " pod="openstack/heat-4ea7-account-create-update-ft8pf"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.450675 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-operator-scripts\") pod \"cinder-db-create-w9lsr\" (UID: \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\") " pod="openstack/cinder-db-create-w9lsr"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.499555 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m4mz2" podUID="33f5965b-43ae-484d-9c5c-1a54ae4de6da" containerName="ovn-controller" probeResult="failure" output=<
Nov 29 14:50:23 crc kubenswrapper[4907]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Nov 29 14:50:23 crc kubenswrapper[4907]: >
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.499602 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-bf9d-account-create-update-q2tvd"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.508239 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bf9d-account-create-update-q2tvd"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.520660 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ndrfl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.520755 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.532146 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ndrfl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.552260 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndsqf\" (UniqueName: \"kubernetes.io/projected/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-kube-api-access-ndsqf\") pod \"barbican-bf9d-account-create-update-q2tvd\" (UID: \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\") " pod="openstack/barbican-bf9d-account-create-update-q2tvd"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.559701 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d0cf369-d1c2-495e-82d9-31d8f75b3538-operator-scripts\") pod \"barbican-db-create-4mwp5\" (UID: \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\") " pod="openstack/barbican-db-create-4mwp5"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.559756 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70d12fd2-0c0c-435c-863d-0b5445b67460-operator-scripts\") pod \"heat-4ea7-account-create-update-ft8pf\" (UID: \"70d12fd2-0c0c-435c-863d-0b5445b67460\") " pod="openstack/heat-4ea7-account-create-update-ft8pf"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.559946 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-operator-scripts\") pod \"barbican-bf9d-account-create-update-q2tvd\" (UID: \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\") " pod="openstack/barbican-bf9d-account-create-update-q2tvd"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.559979 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-operator-scripts\") pod \"cinder-db-create-w9lsr\" (UID: \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\") " pod="openstack/cinder-db-create-w9lsr"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.560070 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmrjq\" (UniqueName: \"kubernetes.io/projected/70d12fd2-0c0c-435c-863d-0b5445b67460-kube-api-access-bmrjq\") pod \"heat-4ea7-account-create-update-ft8pf\" (UID: \"70d12fd2-0c0c-435c-863d-0b5445b67460\") " pod="openstack/heat-4ea7-account-create-update-ft8pf"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.560115 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4l9s\" (UniqueName: \"kubernetes.io/projected/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-kube-api-access-x4l9s\") pod \"cinder-db-create-w9lsr\" (UID: \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\") " pod="openstack/cinder-db-create-w9lsr"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.560163 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rht4q\" (UniqueName: \"kubernetes.io/projected/4d0cf369-d1c2-495e-82d9-31d8f75b3538-kube-api-access-rht4q\") pod \"barbican-db-create-4mwp5\" (UID: \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\") " pod="openstack/barbican-db-create-4mwp5"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.562997 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nlrlx"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.564293 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70d12fd2-0c0c-435c-863d-0b5445b67460-operator-scripts\") pod \"heat-4ea7-account-create-update-ft8pf\" (UID: \"70d12fd2-0c0c-435c-863d-0b5445b67460\") " pod="openstack/heat-4ea7-account-create-update-ft8pf"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.564770 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d0cf369-d1c2-495e-82d9-31d8f75b3538-operator-scripts\") pod \"barbican-db-create-4mwp5\" (UID: \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\") " pod="openstack/barbican-db-create-4mwp5"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.570493 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bf9d-account-create-update-q2tvd"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.571100 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-operator-scripts\") pod \"cinder-db-create-w9lsr\" (UID: \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\") " pod="openstack/cinder-db-create-w9lsr"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.587779 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-prtgl"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.589119 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-prtgl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.612048 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcrpv"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.612112 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rht4q\" (UniqueName: \"kubernetes.io/projected/4d0cf369-d1c2-495e-82d9-31d8f75b3538-kube-api-access-rht4q\") pod \"barbican-db-create-4mwp5\" (UID: \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\") " pod="openstack/barbican-db-create-4mwp5"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.612228 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.612484 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.612651 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.617536 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4l9s\" (UniqueName: \"kubernetes.io/projected/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-kube-api-access-x4l9s\") pod \"cinder-db-create-w9lsr\" (UID: \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\") " pod="openstack/cinder-db-create-w9lsr"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.617588 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmrjq\" (UniqueName: \"kubernetes.io/projected/70d12fd2-0c0c-435c-863d-0b5445b67460-kube-api-access-bmrjq\") pod \"heat-4ea7-account-create-update-ft8pf\" (UID: \"70d12fd2-0c0c-435c-863d-0b5445b67460\") " pod="openstack/heat-4ea7-account-create-update-ft8pf"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.622090 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-prtgl"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.662864 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4ea7-account-create-update-ft8pf"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.664006 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndsqf\" (UniqueName: \"kubernetes.io/projected/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-kube-api-access-ndsqf\") pod \"barbican-bf9d-account-create-update-q2tvd\" (UID: \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\") " pod="openstack/barbican-bf9d-account-create-update-q2tvd"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.664039 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwxdm\" (UniqueName: \"kubernetes.io/projected/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-kube-api-access-rwxdm\") pod \"keystone-db-sync-prtgl\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " pod="openstack/keystone-db-sync-prtgl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.664115 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-operator-scripts\") pod \"barbican-bf9d-account-create-update-q2tvd\" (UID: \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\") " pod="openstack/barbican-bf9d-account-create-update-q2tvd"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.664203 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-combined-ca-bundle\") pod \"keystone-db-sync-prtgl\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " pod="openstack/keystone-db-sync-prtgl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.664247 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-config-data\") pod \"keystone-db-sync-prtgl\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " pod="openstack/keystone-db-sync-prtgl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.665135 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-operator-scripts\") pod \"barbican-bf9d-account-create-update-q2tvd\" (UID: \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\") " pod="openstack/barbican-bf9d-account-create-update-q2tvd"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.699067 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndsqf\" (UniqueName: \"kubernetes.io/projected/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-kube-api-access-ndsqf\") pod \"barbican-bf9d-account-create-update-q2tvd\" (UID: \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\") " pod="openstack/barbican-bf9d-account-create-update-q2tvd"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.764989 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2bed-account-create-update-kkzsn"]
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.767027 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2bed-account-create-update-kkzsn"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.769175 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-combined-ca-bundle\") pod \"keystone-db-sync-prtgl\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " pod="openstack/keystone-db-sync-prtgl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.769241 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-config-data\") pod \"keystone-db-sync-prtgl\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " pod="openstack/keystone-db-sync-prtgl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.769273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwxdm\" (UniqueName: \"kubernetes.io/projected/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-kube-api-access-rwxdm\") pod \"keystone-db-sync-prtgl\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " pod="openstack/keystone-db-sync-prtgl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.772805 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.773025 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-combined-ca-bundle\") pod \"keystone-db-sync-prtgl\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " pod="openstack/keystone-db-sync-prtgl"
Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.777816 4907 generic.go:334] "Generic (PLEG): container finished" podID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d"
containerID="38ab97381af488a71c6823c824d50f461ea8b6a354c14d29bccee03666031681" exitCode=0 Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.777860 4907 generic.go:334] "Generic (PLEG): container finished" podID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerID="385b5d09da1d545cd7645ba514e5bb855e37383ef4da8fcf6cf96958c4c84db0" exitCode=0 Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.777869 4907 generic.go:334] "Generic (PLEG): container finished" podID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerID="71d9ae3ff5f122b40076021f78adaacc594923e8bbd397b5e8e8a878fe116bf5" exitCode=0 Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.777863 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerDied","Data":"38ab97381af488a71c6823c824d50f461ea8b6a354c14d29bccee03666031681"} Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.777925 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerDied","Data":"385b5d09da1d545cd7645ba514e5bb855e37383ef4da8fcf6cf96958c4c84db0"} Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.777938 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerDied","Data":"71d9ae3ff5f122b40076021f78adaacc594923e8bbd397b5e8e8a878fe116bf5"} Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.782787 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-config-data\") pod \"keystone-db-sync-prtgl\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " pod="openstack/keystone-db-sync-prtgl" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.825945 4907 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-db-create-w9lsr" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.826887 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwxdm\" (UniqueName: \"kubernetes.io/projected/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-kube-api-access-rwxdm\") pod \"keystone-db-sync-prtgl\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " pod="openstack/keystone-db-sync-prtgl" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.849035 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4mwp5" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.865577 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bf9d-account-create-update-q2tvd" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.877641 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pmzg\" (UniqueName: \"kubernetes.io/projected/6da37e82-34e6-45a1-a1b8-a373467376a9-kube-api-access-4pmzg\") pod \"cinder-2bed-account-create-update-kkzsn\" (UID: \"6da37e82-34e6-45a1-a1b8-a373467376a9\") " pod="openstack/cinder-2bed-account-create-update-kkzsn" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.877700 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da37e82-34e6-45a1-a1b8-a373467376a9-operator-scripts\") pod \"cinder-2bed-account-create-update-kkzsn\" (UID: \"6da37e82-34e6-45a1-a1b8-a373467376a9\") " pod="openstack/cinder-2bed-account-create-update-kkzsn" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.894187 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2bed-account-create-update-kkzsn"] Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.924547 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-db-create-dsvvn"] Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.925894 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.940397 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dsvvn"] Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.980945 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1359b44b-6aa8-48f7-98e0-faea12df3d79-operator-scripts\") pod \"neutron-db-create-dsvvn\" (UID: \"1359b44b-6aa8-48f7-98e0-faea12df3d79\") " pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.981049 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45vpj\" (UniqueName: \"kubernetes.io/projected/1359b44b-6aa8-48f7-98e0-faea12df3d79-kube-api-access-45vpj\") pod \"neutron-db-create-dsvvn\" (UID: \"1359b44b-6aa8-48f7-98e0-faea12df3d79\") " pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.981076 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pmzg\" (UniqueName: \"kubernetes.io/projected/6da37e82-34e6-45a1-a1b8-a373467376a9-kube-api-access-4pmzg\") pod \"cinder-2bed-account-create-update-kkzsn\" (UID: \"6da37e82-34e6-45a1-a1b8-a373467376a9\") " pod="openstack/cinder-2bed-account-create-update-kkzsn" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.981105 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da37e82-34e6-45a1-a1b8-a373467376a9-operator-scripts\") pod \"cinder-2bed-account-create-update-kkzsn\" (UID: \"6da37e82-34e6-45a1-a1b8-a373467376a9\") " 
pod="openstack/cinder-2bed-account-create-update-kkzsn" Nov 29 14:50:23 crc kubenswrapper[4907]: I1129 14:50:23.981804 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da37e82-34e6-45a1-a1b8-a373467376a9-operator-scripts\") pod \"cinder-2bed-account-create-update-kkzsn\" (UID: \"6da37e82-34e6-45a1-a1b8-a373467376a9\") " pod="openstack/cinder-2bed-account-create-update-kkzsn" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.019917 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pmzg\" (UniqueName: \"kubernetes.io/projected/6da37e82-34e6-45a1-a1b8-a373467376a9-kube-api-access-4pmzg\") pod \"cinder-2bed-account-create-update-kkzsn\" (UID: \"6da37e82-34e6-45a1-a1b8-a373467376a9\") " pod="openstack/cinder-2bed-account-create-update-kkzsn" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.052330 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7e4b-account-create-update-k4vjz"] Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.053881 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.054326 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-prtgl" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.058651 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.086925 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c1c8d94-4154-402a-858d-fd819787ba8e-operator-scripts\") pod \"neutron-7e4b-account-create-update-k4vjz\" (UID: \"6c1c8d94-4154-402a-858d-fd819787ba8e\") " pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.087231 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1359b44b-6aa8-48f7-98e0-faea12df3d79-operator-scripts\") pod \"neutron-db-create-dsvvn\" (UID: \"1359b44b-6aa8-48f7-98e0-faea12df3d79\") " pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.087313 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45vpj\" (UniqueName: \"kubernetes.io/projected/1359b44b-6aa8-48f7-98e0-faea12df3d79-kube-api-access-45vpj\") pod \"neutron-db-create-dsvvn\" (UID: \"1359b44b-6aa8-48f7-98e0-faea12df3d79\") " pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.087373 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjvk2\" (UniqueName: \"kubernetes.io/projected/6c1c8d94-4154-402a-858d-fd819787ba8e-kube-api-access-mjvk2\") pod \"neutron-7e4b-account-create-update-k4vjz\" (UID: \"6c1c8d94-4154-402a-858d-fd819787ba8e\") " pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.088182 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1359b44b-6aa8-48f7-98e0-faea12df3d79-operator-scripts\") pod \"neutron-db-create-dsvvn\" (UID: \"1359b44b-6aa8-48f7-98e0-faea12df3d79\") " pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.120983 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2bed-account-create-update-kkzsn" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.148711 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45vpj\" (UniqueName: \"kubernetes.io/projected/1359b44b-6aa8-48f7-98e0-faea12df3d79-kube-api-access-45vpj\") pod \"neutron-db-create-dsvvn\" (UID: \"1359b44b-6aa8-48f7-98e0-faea12df3d79\") " pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.158007 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m4mz2-config-jqq9r"] Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.159345 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.166195 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.184152 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7e4b-account-create-update-k4vjz"] Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.204972 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdfdc\" (UniqueName: \"kubernetes.io/projected/8094ae6e-4447-47a9-8e27-8d59856f6891-kube-api-access-wdfdc\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.205100 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run-ovn\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.205180 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjvk2\" (UniqueName: \"kubernetes.io/projected/6c1c8d94-4154-402a-858d-fd819787ba8e-kube-api-access-mjvk2\") pod \"neutron-7e4b-account-create-update-k4vjz\" (UID: \"6c1c8d94-4154-402a-858d-fd819787ba8e\") " pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.205206 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-scripts\") pod 
\"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.205226 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.205247 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-log-ovn\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.205277 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c1c8d94-4154-402a-858d-fd819787ba8e-operator-scripts\") pod \"neutron-7e4b-account-create-update-k4vjz\" (UID: \"6c1c8d94-4154-402a-858d-fd819787ba8e\") " pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.205310 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-additional-scripts\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.206759 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6c1c8d94-4154-402a-858d-fd819787ba8e-operator-scripts\") pod \"neutron-7e4b-account-create-update-k4vjz\" (UID: \"6c1c8d94-4154-402a-858d-fd819787ba8e\") " pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.253512 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m4mz2-config-jqq9r"] Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.298502 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv"] Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.310168 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-scripts\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.310248 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.310294 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-log-ovn\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.310361 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-additional-scripts\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.310418 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdfdc\" (UniqueName: \"kubernetes.io/projected/8094ae6e-4447-47a9-8e27-8d59856f6891-kube-api-access-wdfdc\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.310513 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run-ovn\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.310850 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run-ovn\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.313124 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-log-ovn\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.313544 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-additional-scripts\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.313544 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.315717 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.318238 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-scripts\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.360579 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv"] Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.385566 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdfdc\" (UniqueName: \"kubernetes.io/projected/8094ae6e-4447-47a9-8e27-8d59856f6891-kube-api-access-wdfdc\") pod \"ovn-controller-m4mz2-config-jqq9r\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.385688 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjvk2\" (UniqueName: 
\"kubernetes.io/projected/6c1c8d94-4154-402a-858d-fd819787ba8e-kube-api-access-mjvk2\") pod \"neutron-7e4b-account-create-update-k4vjz\" (UID: \"6c1c8d94-4154-402a-858d-fd819787ba8e\") " pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.391368 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.411877 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nsvf\" (UniqueName: \"kubernetes.io/projected/083d8a82-cfe8-4bd9-b612-9466ca400e16-kube-api-access-4nsvf\") pod \"mysqld-exporter-openstack-cell1-db-create-5dzcv\" (UID: \"083d8a82-cfe8-4bd9-b612-9466ca400e16\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.411960 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/083d8a82-cfe8-4bd9-b612-9466ca400e16-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-5dzcv\" (UID: \"083d8a82-cfe8-4bd9-b612-9466ca400e16\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.412128 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.467405 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.517638 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nsvf\" (UniqueName: \"kubernetes.io/projected/083d8a82-cfe8-4bd9-b612-9466ca400e16-kube-api-access-4nsvf\") pod \"mysqld-exporter-openstack-cell1-db-create-5dzcv\" (UID: \"083d8a82-cfe8-4bd9-b612-9466ca400e16\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.517742 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/083d8a82-cfe8-4bd9-b612-9466ca400e16-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-5dzcv\" (UID: \"083d8a82-cfe8-4bd9-b612-9466ca400e16\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.518948 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/083d8a82-cfe8-4bd9-b612-9466ca400e16-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-5dzcv\" (UID: \"083d8a82-cfe8-4bd9-b612-9466ca400e16\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.551972 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nsvf\" (UniqueName: \"kubernetes.io/projected/083d8a82-cfe8-4bd9-b612-9466ca400e16-kube-api-access-4nsvf\") pod \"mysqld-exporter-openstack-cell1-db-create-5dzcv\" (UID: \"083d8a82-cfe8-4bd9-b612-9466ca400e16\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.562767 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-72c9-account-create-update-27v75"] Nov 29 
14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.563992 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.576070 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.581082 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-72c9-account-create-update-27v75"] Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.622835 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-operator-scripts\") pod \"mysqld-exporter-72c9-account-create-update-27v75\" (UID: \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\") " pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.622908 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxzw2\" (UniqueName: \"kubernetes.io/projected/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-kube-api-access-fxzw2\") pod \"mysqld-exporter-72c9-account-create-update-27v75\" (UID: \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\") " pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.705607 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.728049 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-operator-scripts\") pod \"mysqld-exporter-72c9-account-create-update-27v75\" (UID: \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\") " pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.728131 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxzw2\" (UniqueName: \"kubernetes.io/projected/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-kube-api-access-fxzw2\") pod \"mysqld-exporter-72c9-account-create-update-27v75\" (UID: \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\") " pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.750991 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-operator-scripts\") pod \"mysqld-exporter-72c9-account-create-update-27v75\" (UID: \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\") " pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.774246 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxzw2\" (UniqueName: \"kubernetes.io/projected/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-kube-api-access-fxzw2\") pod \"mysqld-exporter-72c9-account-create-update-27v75\" (UID: \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\") " pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.791243 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"1d2e724b-ab17-45b3-a5ec-c43bf54e935d","Type":"ContainerDied","Data":"cb0ace32345c48af82b8ee4bfd9550cf5176ba5b3881e7027032b8abd3746ba0"} Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.791292 4907 scope.go:117] "RemoveContainer" containerID="38ab97381af488a71c6823c824d50f461ea8b6a354c14d29bccee03666031681" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.791300 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.829277 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config-out\") pod \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.829386 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-prometheus-metric-storage-rulefiles-0\") pod \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.829409 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.829521 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config\") pod \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.829558 
4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-thanos-prometheus-http-client-file\") pod \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.829600 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-tls-assets\") pod \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.829636 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5jqt\" (UniqueName: \"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-kube-api-access-m5jqt\") pod \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.829664 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-web-config\") pod \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\" (UID: \"1d2e724b-ab17-45b3-a5ec-c43bf54e935d\") " Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.834329 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config" (OuterVolumeSpecName: "config") pod "1d2e724b-ab17-45b3-a5ec-c43bf54e935d" (UID: "1d2e724b-ab17-45b3-a5ec-c43bf54e935d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.834943 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "1d2e724b-ab17-45b3-a5ec-c43bf54e935d" (UID: "1d2e724b-ab17-45b3-a5ec-c43bf54e935d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.854448 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.856473 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1d2e724b-ab17-45b3-a5ec-c43bf54e935d" (UID: "1d2e724b-ab17-45b3-a5ec-c43bf54e935d"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.856565 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1d2e724b-ab17-45b3-a5ec-c43bf54e935d" (UID: "1d2e724b-ab17-45b3-a5ec-c43bf54e935d"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.857030 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-kube-api-access-m5jqt" (OuterVolumeSpecName: "kube-api-access-m5jqt") pod "1d2e724b-ab17-45b3-a5ec-c43bf54e935d" (UID: "1d2e724b-ab17-45b3-a5ec-c43bf54e935d"). InnerVolumeSpecName "kube-api-access-m5jqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.865423 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "1d2e724b-ab17-45b3-a5ec-c43bf54e935d" (UID: "1d2e724b-ab17-45b3-a5ec-c43bf54e935d"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.870733 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config-out" (OuterVolumeSpecName: "config-out") pod "1d2e724b-ab17-45b3-a5ec-c43bf54e935d" (UID: "1d2e724b-ab17-45b3-a5ec-c43bf54e935d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.929833 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.934822 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5jqt\" (UniqueName: \"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-kube-api-access-m5jqt\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.934843 4907 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config-out\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.934854 4907 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.934885 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.934896 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.934907 4907 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.934916 4907 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-tls-assets\") on node \"crc\" 
DevicePath \"\"" Nov 29 14:50:24 crc kubenswrapper[4907]: I1129 14:50:24.969804 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:24.994296 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-web-config" (OuterVolumeSpecName: "web-config") pod "1d2e724b-ab17-45b3-a5ec-c43bf54e935d" (UID: "1d2e724b-ab17-45b3-a5ec-c43bf54e935d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.043929 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.043973 4907 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d2e724b-ab17-45b3-a5ec-c43bf54e935d-web-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.046615 4907 scope.go:117] "RemoveContainer" containerID="385b5d09da1d545cd7645ba514e5bb855e37383ef4da8fcf6cf96958c4c84db0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.123609 4907 scope.go:117] "RemoveContainer" containerID="71d9ae3ff5f122b40076021f78adaacc594923e8bbd397b5e8e8a878fe116bf5" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.161974 4907 scope.go:117] "RemoveContainer" containerID="aa38b9a165cea5bc8a9d35d4608f4013dc3e0b908f032f057df0bb2cf95050b9" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.214279 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nlrlx"] Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.223818 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.232365 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 14:50:25 crc kubenswrapper[4907]: E1129 14:50:25.253061 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache]" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.316136 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 14:50:25 crc kubenswrapper[4907]: E1129 14:50:25.316897 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="init-config-reloader" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.316916 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="init-config-reloader" Nov 29 14:50:25 crc kubenswrapper[4907]: E1129 14:50:25.316944 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="config-reloader" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.316955 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="config-reloader" Nov 29 14:50:25 crc kubenswrapper[4907]: E1129 14:50:25.316966 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="prometheus" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 
14:50:25.316971 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="prometheus" Nov 29 14:50:25 crc kubenswrapper[4907]: E1129 14:50:25.316982 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="thanos-sidecar" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.316988 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="thanos-sidecar" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.317213 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="thanos-sidecar" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.317224 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="config-reloader" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.317238 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="prometheus" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.319386 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.325244 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.325420 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.325663 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.325844 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.325959 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-psfxj" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.326331 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.335836 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.342771 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 14:50:25 crc kubenswrapper[4907]: E1129 14:50:25.419370 4907 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.47:46746->38.102.83.47:43783: write tcp 38.102.83.47:46746->38.102.83.47:43783: write: broken pipe Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475032 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd2m5\" (UniqueName: 
\"kubernetes.io/projected/b794f06f-38a0-4c4d-933b-db50f05ddfb8-kube-api-access-pd2m5\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475290 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475316 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475356 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475453 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475496 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b794f06f-38a0-4c4d-933b-db50f05ddfb8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475535 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475611 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475631 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b794f06f-38a0-4c4d-933b-db50f05ddfb8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475782 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.475872 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b794f06f-38a0-4c4d-933b-db50f05ddfb8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.577722 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.577800 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b794f06f-38a0-4c4d-933b-db50f05ddfb8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.577823 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.577902 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.577922 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b794f06f-38a0-4c4d-933b-db50f05ddfb8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.577973 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-config\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.578001 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b794f06f-38a0-4c4d-933b-db50f05ddfb8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.578023 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd2m5\" (UniqueName: \"kubernetes.io/projected/b794f06f-38a0-4c4d-933b-db50f05ddfb8-kube-api-access-pd2m5\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.578041 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.578059 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.578080 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.578833 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.580754 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b794f06f-38a0-4c4d-933b-db50f05ddfb8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.587412 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-config\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.587806 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b794f06f-38a0-4c4d-933b-db50f05ddfb8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.588308 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b794f06f-38a0-4c4d-933b-db50f05ddfb8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.588364 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.588911 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.590728 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.591334 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.593595 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b794f06f-38a0-4c4d-933b-db50f05ddfb8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.613983 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd2m5\" (UniqueName: \"kubernetes.io/projected/b794f06f-38a0-4c4d-933b-db50f05ddfb8-kube-api-access-pd2m5\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.621218 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"b794f06f-38a0-4c4d-933b-db50f05ddfb8\") " pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.652560 4907 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.806744 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m4mz2-config-jqq9r"] Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.810133 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nlrlx" event={"ID":"25e7ec61-ebc8-4a53-be29-9243e33b6ca7","Type":"ContainerDied","Data":"e386dac190bea718b2ee91a443059e406fd22311ec85ee3b29866534fa051775"} Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.807040 4907 generic.go:334] "Generic (PLEG): container finished" podID="25e7ec61-ebc8-4a53-be29-9243e33b6ca7" containerID="e386dac190bea718b2ee91a443059e406fd22311ec85ee3b29866534fa051775" exitCode=0 Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.810272 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nlrlx" event={"ID":"25e7ec61-ebc8-4a53-be29-9243e33b6ca7","Type":"ContainerStarted","Data":"14e64a060995506b0b2d86a93dc0bb754cfde596c7e103e80fea2fcedc43ee52"} Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.834502 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-prtgl"] Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.902548 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4mwp5"] Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.930885 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-w9lsr"] Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.939079 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bf9d-account-create-update-q2tvd"] Nov 29 14:50:25 crc kubenswrapper[4907]: W1129 14:50:25.957484 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6ed5dab_4be9_4c17_934d_75ec8a900d7c.slice/crio-9c5c8ff3f38c777d76bed62b206657272063109597ee2663ece08e4e17c05356 WatchSource:0}: Error finding container 9c5c8ff3f38c777d76bed62b206657272063109597ee2663ece08e4e17c05356: Status 404 returned error can't find the container with id 9c5c8ff3f38c777d76bed62b206657272063109597ee2663ece08e4e17c05356 Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.959414 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2bed-account-create-update-kkzsn"] Nov 29 14:50:25 crc kubenswrapper[4907]: I1129 14:50:25.972196 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4ea7-account-create-update-ft8pf"] Nov 29 14:50:26 crc kubenswrapper[4907]: W1129 14:50:26.020407 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70d12fd2_0c0c_435c_863d_0b5445b67460.slice/crio-2f323283a12379d9ad16db99d200f0204fcc3773dff4ea67424ce12f553b510f WatchSource:0}: Error finding container 2f323283a12379d9ad16db99d200f0204fcc3773dff4ea67424ce12f553b510f: Status 404 returned error can't find the container with id 2f323283a12379d9ad16db99d200f0204fcc3773dff4ea67424ce12f553b510f Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.117693 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dsvvn"] Nov 29 14:50:26 crc kubenswrapper[4907]: W1129 14:50:26.171948 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1359b44b_6aa8_48f7_98e0_faea12df3d79.slice/crio-b6d3170930287c694320c20220796aa89c4bb5baa2144c22209ac12a4d50b59b WatchSource:0}: Error finding container b6d3170930287c694320c20220796aa89c4bb5baa2144c22209ac12a4d50b59b: Status 404 returned error can't find the container with id 
b6d3170930287c694320c20220796aa89c4bb5baa2144c22209ac12a4d50b59b Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.189914 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7e4b-account-create-update-k4vjz"] Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.202213 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-72c9-account-create-update-27v75"] Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.235042 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv"] Nov 29 14:50:26 crc kubenswrapper[4907]: W1129 14:50:26.255722 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c1c8d94_4154_402a_858d_fd819787ba8e.slice/crio-9bc72d2f2a1df53748c4cc97a5ce7b7ef5669f95cc9208c4105c21fd8401fea5 WatchSource:0}: Error finding container 9bc72d2f2a1df53748c4cc97a5ce7b7ef5669f95cc9208c4105c21fd8401fea5: Status 404 returned error can't find the container with id 9bc72d2f2a1df53748c4cc97a5ce7b7ef5669f95cc9208c4105c21fd8401fea5 Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.474903 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.527238 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" path="/var/lib/kubelet/pods/1d2e724b-ab17-45b3-a5ec-c43bf54e935d/volumes" Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.854022 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4ea7-account-create-update-ft8pf" event={"ID":"70d12fd2-0c0c-435c-863d-0b5445b67460","Type":"ContainerStarted","Data":"2f323283a12379d9ad16db99d200f0204fcc3773dff4ea67424ce12f553b510f"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.856550 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-7e4b-account-create-update-k4vjz" event={"ID":"6c1c8d94-4154-402a-858d-fd819787ba8e","Type":"ContainerStarted","Data":"9bc72d2f2a1df53748c4cc97a5ce7b7ef5669f95cc9208c4105c21fd8401fea5"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.865118 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" event={"ID":"083d8a82-cfe8-4bd9-b612-9466ca400e16","Type":"ContainerStarted","Data":"d34ece8e78fc6b4a7c865da3b6ed2de4ea995c4e3caaaf388c655d82bfd8b19d"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.868992 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m4mz2-config-jqq9r" event={"ID":"8094ae6e-4447-47a9-8e27-8d59856f6891","Type":"ContainerStarted","Data":"f3d1e36c470e88e27869c5df70860b4c0ff18f2a1236e262b7211607eecbd6e1"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.870909 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4mwp5" event={"ID":"4d0cf369-d1c2-495e-82d9-31d8f75b3538","Type":"ContainerStarted","Data":"9b26d73d2360b73483d19057cf34bc550a8f91e1cdbbfdf95b445b2fbd3d8419"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.870937 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4mwp5" event={"ID":"4d0cf369-d1c2-495e-82d9-31d8f75b3538","Type":"ContainerStarted","Data":"66fa3ddf009c67bf7921bcdb52687c1f1a9774631c254f2365dae42f459d1c4e"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.872687 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dsvvn" event={"ID":"1359b44b-6aa8-48f7-98e0-faea12df3d79","Type":"ContainerStarted","Data":"b6d3170930287c694320c20220796aa89c4bb5baa2144c22209ac12a4d50b59b"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.876101 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" 
event={"ID":"521498fd-fd08-4dc4-ba76-3a92a99cc7a1","Type":"ContainerStarted","Data":"7fcc60b22b499706025c54ae51817f31ee2c529a4748016a8a827861e33115bd"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.878905 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w9lsr" event={"ID":"d6ed5dab-4be9-4c17-934d-75ec8a900d7c","Type":"ContainerStarted","Data":"0e44cb7d9e0cf607fab8de1dd8856075be28bf8047bd168a698aa30210ba4d03"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.878943 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w9lsr" event={"ID":"d6ed5dab-4be9-4c17-934d-75ec8a900d7c","Type":"ContainerStarted","Data":"9c5c8ff3f38c777d76bed62b206657272063109597ee2663ece08e4e17c05356"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.880582 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b794f06f-38a0-4c4d-933b-db50f05ddfb8","Type":"ContainerStarted","Data":"52f429c93bf838ef34993eea6a42bbbea3cd7f8548ab9dca6ea779f03e8f0a49"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.882681 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2bed-account-create-update-kkzsn" event={"ID":"6da37e82-34e6-45a1-a1b8-a373467376a9","Type":"ContainerStarted","Data":"11955dce820a32ba18a1ae63aeebd0ec2ff178812fe6ca5cea2b117049212189"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.883934 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-prtgl" event={"ID":"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3","Type":"ContainerStarted","Data":"0a38d0e16a37427e6c605ba68e9a3606c133ae7397c4d61dafd25e12c11293cc"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.888337 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bf9d-account-create-update-q2tvd" 
event={"ID":"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331","Type":"ContainerStarted","Data":"b937ccdff7beb998922dd7b7289054200f4a60b48484cf9da60a9a95701d295d"} Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.909882 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-4mwp5" podStartSLOduration=3.909863192 podStartE2EDuration="3.909863192s" podCreationTimestamp="2025-11-29 14:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:26.889577364 +0000 UTC m=+1324.876415016" watchObservedRunningTime="2025-11-29 14:50:26.909863192 +0000 UTC m=+1324.896700844" Nov 29 14:50:26 crc kubenswrapper[4907]: I1129 14:50:26.925220 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-w9lsr" podStartSLOduration=3.925202278 podStartE2EDuration="3.925202278s" podCreationTimestamp="2025-11-29 14:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:26.907969808 +0000 UTC m=+1324.894807470" watchObservedRunningTime="2025-11-29 14:50:26.925202278 +0000 UTC m=+1324.912039930" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.455925 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="1d2e724b-ab17-45b3-a5ec-c43bf54e935d" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.137:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.669718 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-nlrlx" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.724924 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-operator-scripts\") pod \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\" (UID: \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\") " Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.725980 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwkf2\" (UniqueName: \"kubernetes.io/projected/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-kube-api-access-mwkf2\") pod \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\" (UID: \"25e7ec61-ebc8-4a53-be29-9243e33b6ca7\") " Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.728051 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25e7ec61-ebc8-4a53-be29-9243e33b6ca7" (UID: "25e7ec61-ebc8-4a53-be29-9243e33b6ca7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.734415 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-kube-api-access-mwkf2" (OuterVolumeSpecName: "kube-api-access-mwkf2") pod "25e7ec61-ebc8-4a53-be29-9243e33b6ca7" (UID: "25e7ec61-ebc8-4a53-be29-9243e33b6ca7"). InnerVolumeSpecName "kube-api-access-mwkf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.834031 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwkf2\" (UniqueName: \"kubernetes.io/projected/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-kube-api-access-mwkf2\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.834058 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25e7ec61-ebc8-4a53-be29-9243e33b6ca7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.935366 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dsvvn" event={"ID":"1359b44b-6aa8-48f7-98e0-faea12df3d79","Type":"ContainerStarted","Data":"26fb0f8b8acacb5aea1b72987c70e47593b178c5c4865c168661bc3c60010fa5"} Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.955058 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" event={"ID":"521498fd-fd08-4dc4-ba76-3a92a99cc7a1","Type":"ContainerStarted","Data":"dc5da379a81f564fd83c7672523bbf9843ea956a4c4babf58b27c97726b320d5"} Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.956758 4907 generic.go:334] "Generic (PLEG): container finished" podID="6da37e82-34e6-45a1-a1b8-a373467376a9" containerID="0a473041a4a5bec0db20c0ffae9453e70991bd3141a2c075d2510200c2eea27c" exitCode=0 Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.956805 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2bed-account-create-update-kkzsn" event={"ID":"6da37e82-34e6-45a1-a1b8-a373467376a9","Type":"ContainerDied","Data":"0a473041a4a5bec0db20c0ffae9453e70991bd3141a2c075d2510200c2eea27c"} Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.962367 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-db-create-dsvvn" podStartSLOduration=4.962357109 podStartE2EDuration="4.962357109s" podCreationTimestamp="2025-11-29 14:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:27.961684989 +0000 UTC m=+1325.948522641" watchObservedRunningTime="2025-11-29 14:50:27.962357109 +0000 UTC m=+1325.949194751" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.965599 4907 generic.go:334] "Generic (PLEG): container finished" podID="9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331" containerID="d2edc9c9fb3ca2f844f121dfc2e6ed8f2336106e6e36b58f3e18e6ec1e24f83b" exitCode=0 Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.965675 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bf9d-account-create-update-q2tvd" event={"ID":"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331","Type":"ContainerDied","Data":"d2edc9c9fb3ca2f844f121dfc2e6ed8f2336106e6e36b58f3e18e6ec1e24f83b"} Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.977314 4907 generic.go:334] "Generic (PLEG): container finished" podID="8094ae6e-4447-47a9-8e27-8d59856f6891" containerID="7fc160d1dcbba4c1f63142c2b38c370bcd6c3373d098a0ef4b6e8d749d144a31" exitCode=0 Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.977429 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m4mz2-config-jqq9r" event={"ID":"8094ae6e-4447-47a9-8e27-8d59856f6891","Type":"ContainerDied","Data":"7fc160d1dcbba4c1f63142c2b38c370bcd6c3373d098a0ef4b6e8d749d144a31"} Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.979841 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4ea7-account-create-update-ft8pf" event={"ID":"70d12fd2-0c0c-435c-863d-0b5445b67460","Type":"ContainerStarted","Data":"88961f8842b754e34e4ac8b6b42e57fc64097ffe28f55d74f54fb6858ebd082a"} Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.984006 4907 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nlrlx" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.984050 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nlrlx" event={"ID":"25e7ec61-ebc8-4a53-be29-9243e33b6ca7","Type":"ContainerDied","Data":"14e64a060995506b0b2d86a93dc0bb754cfde596c7e103e80fea2fcedc43ee52"} Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.984075 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e64a060995506b0b2d86a93dc0bb754cfde596c7e103e80fea2fcedc43ee52" Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.986277 4907 generic.go:334] "Generic (PLEG): container finished" podID="d6ed5dab-4be9-4c17-934d-75ec8a900d7c" containerID="0e44cb7d9e0cf607fab8de1dd8856075be28bf8047bd168a698aa30210ba4d03" exitCode=0 Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.986354 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w9lsr" event={"ID":"d6ed5dab-4be9-4c17-934d-75ec8a900d7c","Type":"ContainerDied","Data":"0e44cb7d9e0cf607fab8de1dd8856075be28bf8047bd168a698aa30210ba4d03"} Nov 29 14:50:27 crc kubenswrapper[4907]: I1129 14:50:27.996007 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7e4b-account-create-update-k4vjz" event={"ID":"6c1c8d94-4154-402a-858d-fd819787ba8e","Type":"ContainerStarted","Data":"9b1f9500838f026044ed9ddd30bf803d4f95c6d20f5fb3ca895f5f2dbe98f598"} Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.002912 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" podStartSLOduration=4.002900164 podStartE2EDuration="4.002900164s" podCreationTimestamp="2025-11-29 14:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:27.992494437 +0000 UTC 
m=+1325.979332089" watchObservedRunningTime="2025-11-29 14:50:28.002900164 +0000 UTC m=+1325.989737816" Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.014794 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" event={"ID":"083d8a82-cfe8-4bd9-b612-9466ca400e16","Type":"ContainerStarted","Data":"4bab71bb196e838c8325409c8adc8d7d8f80faf4927773e4e43f51e88ea532cc"} Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.023223 4907 generic.go:334] "Generic (PLEG): container finished" podID="4d0cf369-d1c2-495e-82d9-31d8f75b3538" containerID="9b26d73d2360b73483d19057cf34bc550a8f91e1cdbbfdf95b445b2fbd3d8419" exitCode=0 Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.023277 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4mwp5" event={"ID":"4d0cf369-d1c2-495e-82d9-31d8f75b3538","Type":"ContainerDied","Data":"9b26d73d2360b73483d19057cf34bc550a8f91e1cdbbfdf95b445b2fbd3d8419"} Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.030726 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-4ea7-account-create-update-ft8pf" podStartSLOduration=5.030705837 podStartE2EDuration="5.030705837s" podCreationTimestamp="2025-11-29 14:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:28.016335367 +0000 UTC m=+1326.003173019" watchObservedRunningTime="2025-11-29 14:50:28.030705837 +0000 UTC m=+1326.017543489" Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.116117 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7e4b-account-create-update-k4vjz" podStartSLOduration=5.11609788 podStartE2EDuration="5.11609788s" podCreationTimestamp="2025-11-29 14:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-29 14:50:28.109900884 +0000 UTC m=+1326.096738546" watchObservedRunningTime="2025-11-29 14:50:28.11609788 +0000 UTC m=+1326.102935532" Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.149369 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" podStartSLOduration=4.149353588 podStartE2EDuration="4.149353588s" podCreationTimestamp="2025-11-29 14:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:28.14064147 +0000 UTC m=+1326.127479132" watchObservedRunningTime="2025-11-29 14:50:28.149353588 +0000 UTC m=+1326.136191230" Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.368293 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-m4mz2" Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.491079 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:50:28 crc kubenswrapper[4907]: I1129 14:50:28.491126 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.037025 4907 generic.go:334] "Generic (PLEG): container finished" podID="70d12fd2-0c0c-435c-863d-0b5445b67460" containerID="88961f8842b754e34e4ac8b6b42e57fc64097ffe28f55d74f54fb6858ebd082a" exitCode=0 Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.037096 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4ea7-account-create-update-ft8pf" event={"ID":"70d12fd2-0c0c-435c-863d-0b5445b67460","Type":"ContainerDied","Data":"88961f8842b754e34e4ac8b6b42e57fc64097ffe28f55d74f54fb6858ebd082a"} Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.040406 4907 generic.go:334] "Generic (PLEG): container finished" podID="1359b44b-6aa8-48f7-98e0-faea12df3d79" containerID="26fb0f8b8acacb5aea1b72987c70e47593b178c5c4865c168661bc3c60010fa5" exitCode=0 Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.040506 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dsvvn" event={"ID":"1359b44b-6aa8-48f7-98e0-faea12df3d79","Type":"ContainerDied","Data":"26fb0f8b8acacb5aea1b72987c70e47593b178c5c4865c168661bc3c60010fa5"} Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.042085 4907 generic.go:334] "Generic (PLEG): container finished" podID="521498fd-fd08-4dc4-ba76-3a92a99cc7a1" containerID="dc5da379a81f564fd83c7672523bbf9843ea956a4c4babf58b27c97726b320d5" exitCode=0 Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.042116 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" event={"ID":"521498fd-fd08-4dc4-ba76-3a92a99cc7a1","Type":"ContainerDied","Data":"dc5da379a81f564fd83c7672523bbf9843ea956a4c4babf58b27c97726b320d5"} Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.044303 4907 generic.go:334] "Generic (PLEG): container finished" podID="6c1c8d94-4154-402a-858d-fd819787ba8e" containerID="9b1f9500838f026044ed9ddd30bf803d4f95c6d20f5fb3ca895f5f2dbe98f598" exitCode=0 Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.044673 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7e4b-account-create-update-k4vjz" event={"ID":"6c1c8d94-4154-402a-858d-fd819787ba8e","Type":"ContainerDied","Data":"9b1f9500838f026044ed9ddd30bf803d4f95c6d20f5fb3ca895f5f2dbe98f598"} Nov 29 14:50:29 
crc kubenswrapper[4907]: I1129 14:50:29.047032 4907 generic.go:334] "Generic (PLEG): container finished" podID="083d8a82-cfe8-4bd9-b612-9466ca400e16" containerID="4bab71bb196e838c8325409c8adc8d7d8f80faf4927773e4e43f51e88ea532cc" exitCode=0 Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.047124 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" event={"ID":"083d8a82-cfe8-4bd9-b612-9466ca400e16","Type":"ContainerDied","Data":"4bab71bb196e838c8325409c8adc8d7d8f80faf4927773e4e43f51e88ea532cc"} Nov 29 14:50:29 crc kubenswrapper[4907]: E1129 14:50:29.401092 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache]" Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.581681 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4mwp5" Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.715066 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d0cf369-d1c2-495e-82d9-31d8f75b3538-operator-scripts\") pod \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\" (UID: \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\") " Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.715764 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rht4q\" (UniqueName: \"kubernetes.io/projected/4d0cf369-d1c2-495e-82d9-31d8f75b3538-kube-api-access-rht4q\") pod \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\" (UID: \"4d0cf369-d1c2-495e-82d9-31d8f75b3538\") " Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.716113 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d0cf369-d1c2-495e-82d9-31d8f75b3538-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d0cf369-d1c2-495e-82d9-31d8f75b3538" (UID: "4d0cf369-d1c2-495e-82d9-31d8f75b3538"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.716756 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d0cf369-d1c2-495e-82d9-31d8f75b3538-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.726968 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d0cf369-d1c2-495e-82d9-31d8f75b3538-kube-api-access-rht4q" (OuterVolumeSpecName: "kube-api-access-rht4q") pod "4d0cf369-d1c2-495e-82d9-31d8f75b3538" (UID: "4d0cf369-d1c2-495e-82d9-31d8f75b3538"). InnerVolumeSpecName "kube-api-access-rht4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:29 crc kubenswrapper[4907]: I1129 14:50:29.818433 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rht4q\" (UniqueName: \"kubernetes.io/projected/4d0cf369-d1c2-495e-82d9-31d8f75b3538-kube-api-access-rht4q\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:30 crc kubenswrapper[4907]: I1129 14:50:30.059200 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4mwp5" Nov 29 14:50:30 crc kubenswrapper[4907]: I1129 14:50:30.059243 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4mwp5" event={"ID":"4d0cf369-d1c2-495e-82d9-31d8f75b3538","Type":"ContainerDied","Data":"66fa3ddf009c67bf7921bcdb52687c1f1a9774631c254f2365dae42f459d1c4e"} Nov 29 14:50:30 crc kubenswrapper[4907]: I1129 14:50:30.059308 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66fa3ddf009c67bf7921bcdb52687c1f1a9774631c254f2365dae42f459d1c4e" Nov 29 14:50:31 crc kubenswrapper[4907]: I1129 14:50:31.071933 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b794f06f-38a0-4c4d-933b-db50f05ddfb8","Type":"ContainerStarted","Data":"b1dcf51e709874aab7f22e33ad835fb772240dbe871721a5be5fc8a3a9c5f57b"} Nov 29 14:50:31 crc kubenswrapper[4907]: I1129 14:50:31.149388 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift\") pod \"swift-storage-0\" (UID: \"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:50:31 crc kubenswrapper[4907]: I1129 14:50:31.163454 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe027ad6-8a24-44b5-8bfb-732d5c8fe22a-etc-swift\") pod \"swift-storage-0\" (UID: 
\"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a\") " pod="openstack/swift-storage-0" Nov 29 14:50:31 crc kubenswrapper[4907]: I1129 14:50:31.215250 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.797379 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2bed-account-create-update-kkzsn" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.804756 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.867345 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bf9d-account-create-update-q2tvd" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.892425 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nsvf\" (UniqueName: \"kubernetes.io/projected/083d8a82-cfe8-4bd9-b612-9466ca400e16-kube-api-access-4nsvf\") pod \"083d8a82-cfe8-4bd9-b612-9466ca400e16\" (UID: \"083d8a82-cfe8-4bd9-b612-9466ca400e16\") " Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.892610 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da37e82-34e6-45a1-a1b8-a373467376a9-operator-scripts\") pod \"6da37e82-34e6-45a1-a1b8-a373467376a9\" (UID: \"6da37e82-34e6-45a1-a1b8-a373467376a9\") " Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.892660 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pmzg\" (UniqueName: \"kubernetes.io/projected/6da37e82-34e6-45a1-a1b8-a373467376a9-kube-api-access-4pmzg\") pod \"6da37e82-34e6-45a1-a1b8-a373467376a9\" (UID: \"6da37e82-34e6-45a1-a1b8-a373467376a9\") " Nov 29 14:50:32 crc kubenswrapper[4907]: 
I1129 14:50:32.892713 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndsqf\" (UniqueName: \"kubernetes.io/projected/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-kube-api-access-ndsqf\") pod \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\" (UID: \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\") " Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.892757 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-operator-scripts\") pod \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\" (UID: \"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331\") " Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.892787 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/083d8a82-cfe8-4bd9-b612-9466ca400e16-operator-scripts\") pod \"083d8a82-cfe8-4bd9-b612-9466ca400e16\" (UID: \"083d8a82-cfe8-4bd9-b612-9466ca400e16\") " Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.894750 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/083d8a82-cfe8-4bd9-b612-9466ca400e16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "083d8a82-cfe8-4bd9-b612-9466ca400e16" (UID: "083d8a82-cfe8-4bd9-b612-9466ca400e16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.895994 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331" (UID: "9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.903717 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da37e82-34e6-45a1-a1b8-a373467376a9-kube-api-access-4pmzg" (OuterVolumeSpecName: "kube-api-access-4pmzg") pod "6da37e82-34e6-45a1-a1b8-a373467376a9" (UID: "6da37e82-34e6-45a1-a1b8-a373467376a9"). InnerVolumeSpecName "kube-api-access-4pmzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.904219 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da37e82-34e6-45a1-a1b8-a373467376a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6da37e82-34e6-45a1-a1b8-a373467376a9" (UID: "6da37e82-34e6-45a1-a1b8-a373467376a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.906862 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-kube-api-access-ndsqf" (OuterVolumeSpecName: "kube-api-access-ndsqf") pod "9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331" (UID: "9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331"). InnerVolumeSpecName "kube-api-access-ndsqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.926487 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/083d8a82-cfe8-4bd9-b612-9466ca400e16-kube-api-access-4nsvf" (OuterVolumeSpecName: "kube-api-access-4nsvf") pod "083d8a82-cfe8-4bd9-b612-9466ca400e16" (UID: "083d8a82-cfe8-4bd9-b612-9466ca400e16"). InnerVolumeSpecName "kube-api-access-4nsvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.998422 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nsvf\" (UniqueName: \"kubernetes.io/projected/083d8a82-cfe8-4bd9-b612-9466ca400e16-kube-api-access-4nsvf\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.998472 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6da37e82-34e6-45a1-a1b8-a373467376a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.998481 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pmzg\" (UniqueName: \"kubernetes.io/projected/6da37e82-34e6-45a1-a1b8-a373467376a9-kube-api-access-4pmzg\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.998490 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndsqf\" (UniqueName: \"kubernetes.io/projected/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-kube-api-access-ndsqf\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.998498 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:32 crc kubenswrapper[4907]: I1129 14:50:32.998505 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/083d8a82-cfe8-4bd9-b612-9466ca400e16-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.056156 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.075503 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w9lsr" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.100245 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-additional-scripts\") pod \"8094ae6e-4447-47a9-8e27-8d59856f6891\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.100301 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run\") pod \"8094ae6e-4447-47a9-8e27-8d59856f6891\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.100382 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-scripts\") pod \"8094ae6e-4447-47a9-8e27-8d59856f6891\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.100400 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-log-ovn\") pod \"8094ae6e-4447-47a9-8e27-8d59856f6891\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.100518 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run" (OuterVolumeSpecName: "var-run") pod "8094ae6e-4447-47a9-8e27-8d59856f6891" (UID: "8094ae6e-4447-47a9-8e27-8d59856f6891"). 
InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.100593 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdfdc\" (UniqueName: \"kubernetes.io/projected/8094ae6e-4447-47a9-8e27-8d59856f6891-kube-api-access-wdfdc\") pod \"8094ae6e-4447-47a9-8e27-8d59856f6891\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.100608 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.100631 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8094ae6e-4447-47a9-8e27-8d59856f6891" (UID: "8094ae6e-4447-47a9-8e27-8d59856f6891"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.101189 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run-ovn\") pod \"8094ae6e-4447-47a9-8e27-8d59856f6891\" (UID: \"8094ae6e-4447-47a9-8e27-8d59856f6891\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.101183 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8094ae6e-4447-47a9-8e27-8d59856f6891" (UID: "8094ae6e-4447-47a9-8e27-8d59856f6891"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.101209 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8094ae6e-4447-47a9-8e27-8d59856f6891" (UID: "8094ae6e-4447-47a9-8e27-8d59856f6891"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.101220 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4l9s\" (UniqueName: \"kubernetes.io/projected/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-kube-api-access-x4l9s\") pod \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\" (UID: \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.101313 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-operator-scripts\") pod \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\" (UID: \"d6ed5dab-4be9-4c17-934d-75ec8a900d7c\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.101511 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-scripts" (OuterVolumeSpecName: "scripts") pod "8094ae6e-4447-47a9-8e27-8d59856f6891" (UID: "8094ae6e-4447-47a9-8e27-8d59856f6891"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.102417 4907 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.102432 4907 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.102453 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8094ae6e-4447-47a9-8e27-8d59856f6891-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.102463 4907 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.102486 4907 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8094ae6e-4447-47a9-8e27-8d59856f6891-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.103780 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6ed5dab-4be9-4c17-934d-75ec8a900d7c" (UID: "d6ed5dab-4be9-4c17-934d-75ec8a900d7c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.104581 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-kube-api-access-x4l9s" (OuterVolumeSpecName: "kube-api-access-x4l9s") pod "d6ed5dab-4be9-4c17-934d-75ec8a900d7c" (UID: "d6ed5dab-4be9-4c17-934d-75ec8a900d7c"). InnerVolumeSpecName "kube-api-access-x4l9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.104627 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8094ae6e-4447-47a9-8e27-8d59856f6891-kube-api-access-wdfdc" (OuterVolumeSpecName: "kube-api-access-wdfdc") pod "8094ae6e-4447-47a9-8e27-8d59856f6891" (UID: "8094ae6e-4447-47a9-8e27-8d59856f6891"). InnerVolumeSpecName "kube-api-access-wdfdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.105466 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.107789 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-4ea7-account-create-update-ft8pf" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.112231 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2bed-account-create-update-kkzsn" event={"ID":"6da37e82-34e6-45a1-a1b8-a373467376a9","Type":"ContainerDied","Data":"11955dce820a32ba18a1ae63aeebd0ec2ff178812fe6ca5cea2b117049212189"} Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.112259 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11955dce820a32ba18a1ae63aeebd0ec2ff178812fe6ca5cea2b117049212189" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.112328 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2bed-account-create-update-kkzsn" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.112890 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.113622 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bf9d-account-create-update-q2tvd" event={"ID":"9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331","Type":"ContainerDied","Data":"b937ccdff7beb998922dd7b7289054200f4a60b48484cf9da60a9a95701d295d"} Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.113646 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b937ccdff7beb998922dd7b7289054200f4a60b48484cf9da60a9a95701d295d" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.113688 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bf9d-account-create-update-q2tvd" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.125888 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" event={"ID":"521498fd-fd08-4dc4-ba76-3a92a99cc7a1","Type":"ContainerDied","Data":"7fcc60b22b499706025c54ae51817f31ee2c529a4748016a8a827861e33115bd"} Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.125946 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fcc60b22b499706025c54ae51817f31ee2c529a4748016a8a827861e33115bd" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.126045 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-72c9-account-create-update-27v75" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.129793 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w9lsr" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.129830 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w9lsr" event={"ID":"d6ed5dab-4be9-4c17-934d-75ec8a900d7c","Type":"ContainerDied","Data":"9c5c8ff3f38c777d76bed62b206657272063109597ee2663ece08e4e17c05356"} Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.129868 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c5c8ff3f38c777d76bed62b206657272063109597ee2663ece08e4e17c05356" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.141873 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.141871 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv" event={"ID":"083d8a82-cfe8-4bd9-b612-9466ca400e16","Type":"ContainerDied","Data":"d34ece8e78fc6b4a7c865da3b6ed2de4ea995c4e3caaaf388c655d82bfd8b19d"} Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.142711 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d34ece8e78fc6b4a7c865da3b6ed2de4ea995c4e3caaaf388c655d82bfd8b19d" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.153271 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4ea7-account-create-update-ft8pf" event={"ID":"70d12fd2-0c0c-435c-863d-0b5445b67460","Type":"ContainerDied","Data":"2f323283a12379d9ad16db99d200f0204fcc3773dff4ea67424ce12f553b510f"} Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.153309 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f323283a12379d9ad16db99d200f0204fcc3773dff4ea67424ce12f553b510f" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.153362 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-4ea7-account-create-update-ft8pf" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.158701 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dsvvn" event={"ID":"1359b44b-6aa8-48f7-98e0-faea12df3d79","Type":"ContainerDied","Data":"b6d3170930287c694320c20220796aa89c4bb5baa2144c22209ac12a4d50b59b"} Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.158733 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6d3170930287c694320c20220796aa89c4bb5baa2144c22209ac12a4d50b59b" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.158744 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dsvvn" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.167271 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7e4b-account-create-update-k4vjz" event={"ID":"6c1c8d94-4154-402a-858d-fd819787ba8e","Type":"ContainerDied","Data":"9bc72d2f2a1df53748c4cc97a5ce7b7ef5669f95cc9208c4105c21fd8401fea5"} Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.167320 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc72d2f2a1df53748c4cc97a5ce7b7ef5669f95cc9208c4105c21fd8401fea5" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.167388 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7e4b-account-create-update-k4vjz" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.176352 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m4mz2-config-jqq9r" event={"ID":"8094ae6e-4447-47a9-8e27-8d59856f6891","Type":"ContainerDied","Data":"f3d1e36c470e88e27869c5df70860b4c0ff18f2a1236e262b7211607eecbd6e1"} Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.176384 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3d1e36c470e88e27869c5df70860b4c0ff18f2a1236e262b7211607eecbd6e1" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.176426 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m4mz2-config-jqq9r" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.203874 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70d12fd2-0c0c-435c-863d-0b5445b67460-operator-scripts\") pod \"70d12fd2-0c0c-435c-863d-0b5445b67460\" (UID: \"70d12fd2-0c0c-435c-863d-0b5445b67460\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.203965 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45vpj\" (UniqueName: \"kubernetes.io/projected/1359b44b-6aa8-48f7-98e0-faea12df3d79-kube-api-access-45vpj\") pod \"1359b44b-6aa8-48f7-98e0-faea12df3d79\" (UID: \"1359b44b-6aa8-48f7-98e0-faea12df3d79\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204038 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxzw2\" (UniqueName: \"kubernetes.io/projected/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-kube-api-access-fxzw2\") pod \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\" (UID: \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204058 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bmrjq\" (UniqueName: \"kubernetes.io/projected/70d12fd2-0c0c-435c-863d-0b5445b67460-kube-api-access-bmrjq\") pod \"70d12fd2-0c0c-435c-863d-0b5445b67460\" (UID: \"70d12fd2-0c0c-435c-863d-0b5445b67460\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204140 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c1c8d94-4154-402a-858d-fd819787ba8e-operator-scripts\") pod \"6c1c8d94-4154-402a-858d-fd819787ba8e\" (UID: \"6c1c8d94-4154-402a-858d-fd819787ba8e\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204192 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjvk2\" (UniqueName: \"kubernetes.io/projected/6c1c8d94-4154-402a-858d-fd819787ba8e-kube-api-access-mjvk2\") pod \"6c1c8d94-4154-402a-858d-fd819787ba8e\" (UID: \"6c1c8d94-4154-402a-858d-fd819787ba8e\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204287 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1359b44b-6aa8-48f7-98e0-faea12df3d79-operator-scripts\") pod \"1359b44b-6aa8-48f7-98e0-faea12df3d79\" (UID: \"1359b44b-6aa8-48f7-98e0-faea12df3d79\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204319 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-operator-scripts\") pod \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\" (UID: \"521498fd-fd08-4dc4-ba76-3a92a99cc7a1\") " Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204950 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdfdc\" (UniqueName: \"kubernetes.io/projected/8094ae6e-4447-47a9-8e27-8d59856f6891-kube-api-access-wdfdc\") on node \"crc\" DevicePath \"\"" 
Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204968 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4l9s\" (UniqueName: \"kubernetes.io/projected/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-kube-api-access-x4l9s\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204963 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c1c8d94-4154-402a-858d-fd819787ba8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c1c8d94-4154-402a-858d-fd819787ba8e" (UID: "6c1c8d94-4154-402a-858d-fd819787ba8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204978 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ed5dab-4be9-4c17-934d-75ec8a900d7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.204992 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d12fd2-0c0c-435c-863d-0b5445b67460-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70d12fd2-0c0c-435c-863d-0b5445b67460" (UID: "70d12fd2-0c0c-435c-863d-0b5445b67460"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.205036 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1359b44b-6aa8-48f7-98e0-faea12df3d79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1359b44b-6aa8-48f7-98e0-faea12df3d79" (UID: "1359b44b-6aa8-48f7-98e0-faea12df3d79"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.207957 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "521498fd-fd08-4dc4-ba76-3a92a99cc7a1" (UID: "521498fd-fd08-4dc4-ba76-3a92a99cc7a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.214953 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1c8d94-4154-402a-858d-fd819787ba8e-kube-api-access-mjvk2" (OuterVolumeSpecName: "kube-api-access-mjvk2") pod "6c1c8d94-4154-402a-858d-fd819787ba8e" (UID: "6c1c8d94-4154-402a-858d-fd819787ba8e"). InnerVolumeSpecName "kube-api-access-mjvk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.217062 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d12fd2-0c0c-435c-863d-0b5445b67460-kube-api-access-bmrjq" (OuterVolumeSpecName: "kube-api-access-bmrjq") pod "70d12fd2-0c0c-435c-863d-0b5445b67460" (UID: "70d12fd2-0c0c-435c-863d-0b5445b67460"). InnerVolumeSpecName "kube-api-access-bmrjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.232218 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-kube-api-access-fxzw2" (OuterVolumeSpecName: "kube-api-access-fxzw2") pod "521498fd-fd08-4dc4-ba76-3a92a99cc7a1" (UID: "521498fd-fd08-4dc4-ba76-3a92a99cc7a1"). InnerVolumeSpecName "kube-api-access-fxzw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.232343 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1359b44b-6aa8-48f7-98e0-faea12df3d79-kube-api-access-45vpj" (OuterVolumeSpecName: "kube-api-access-45vpj") pod "1359b44b-6aa8-48f7-98e0-faea12df3d79" (UID: "1359b44b-6aa8-48f7-98e0-faea12df3d79"). InnerVolumeSpecName "kube-api-access-45vpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.305749 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c1c8d94-4154-402a-858d-fd819787ba8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.305776 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjvk2\" (UniqueName: \"kubernetes.io/projected/6c1c8d94-4154-402a-858d-fd819787ba8e-kube-api-access-mjvk2\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.305789 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1359b44b-6aa8-48f7-98e0-faea12df3d79-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.305800 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.305808 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70d12fd2-0c0c-435c-863d-0b5445b67460-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.305817 4907 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-45vpj\" (UniqueName: \"kubernetes.io/projected/1359b44b-6aa8-48f7-98e0-faea12df3d79-kube-api-access-45vpj\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.305825 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxzw2\" (UniqueName: \"kubernetes.io/projected/521498fd-fd08-4dc4-ba76-3a92a99cc7a1-kube-api-access-fxzw2\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.305834 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmrjq\" (UniqueName: \"kubernetes.io/projected/70d12fd2-0c0c-435c-863d-0b5445b67460-kube-api-access-bmrjq\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:33 crc kubenswrapper[4907]: I1129 14:50:33.327360 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.195857 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"c97655a19eb040f0a460ac99d02a7a9efe2a616af41722648939bf41c7da03ae"} Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.243267 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m4mz2-config-jqq9r"] Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.245140 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-prtgl" event={"ID":"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3","Type":"ContainerStarted","Data":"b944be8db7cc0750ea5e25b0a2d3ca806a386547d1944b7ffc6f2c350e953da2"} Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.267224 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m4mz2-config-jqq9r"] Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.285311 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-sync-prtgl" podStartSLOduration=4.303594753 podStartE2EDuration="11.285290348s" podCreationTimestamp="2025-11-29 14:50:23 +0000 UTC" firstStartedPulling="2025-11-29 14:50:25.877019295 +0000 UTC m=+1323.863856947" lastFinishedPulling="2025-11-29 14:50:32.85871489 +0000 UTC m=+1330.845552542" observedRunningTime="2025-11-29 14:50:34.265177925 +0000 UTC m=+1332.252015577" watchObservedRunningTime="2025-11-29 14:50:34.285290348 +0000 UTC m=+1332.272128000" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.496945 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8094ae6e-4447-47a9-8e27-8d59856f6891" path="/var/lib/kubelet/pods/8094ae6e-4447-47a9-8e27-8d59856f6891/volumes" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858122 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858635 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da37e82-34e6-45a1-a1b8-a373467376a9" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858653 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da37e82-34e6-45a1-a1b8-a373467376a9" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858664 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0cf369-d1c2-495e-82d9-31d8f75b3538" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858672 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0cf369-d1c2-495e-82d9-31d8f75b3538" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858681 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1359b44b-6aa8-48f7-98e0-faea12df3d79" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858689 
4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1359b44b-6aa8-48f7-98e0-faea12df3d79" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858715 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="083d8a82-cfe8-4bd9-b612-9466ca400e16" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858722 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="083d8a82-cfe8-4bd9-b612-9466ca400e16" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858736 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ed5dab-4be9-4c17-934d-75ec8a900d7c" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858744 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ed5dab-4be9-4c17-934d-75ec8a900d7c" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858762 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e7ec61-ebc8-4a53-be29-9243e33b6ca7" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858771 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e7ec61-ebc8-4a53-be29-9243e33b6ca7" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858790 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d12fd2-0c0c-435c-863d-0b5445b67460" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858797 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d12fd2-0c0c-435c-863d-0b5445b67460" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858808 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8094ae6e-4447-47a9-8e27-8d59856f6891" 
containerName="ovn-config" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858815 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8094ae6e-4447-47a9-8e27-8d59856f6891" containerName="ovn-config" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858830 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858837 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858849 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521498fd-fd08-4dc4-ba76-3a92a99cc7a1" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858857 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="521498fd-fd08-4dc4-ba76-3a92a99cc7a1" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: E1129 14:50:34.858879 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1c8d94-4154-402a-858d-fd819787ba8e" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.858887 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1c8d94-4154-402a-858d-fd819787ba8e" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859118 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="083d8a82-cfe8-4bd9-b612-9466ca400e16" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859136 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0cf369-d1c2-495e-82d9-31d8f75b3538" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859149 4907 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6da37e82-34e6-45a1-a1b8-a373467376a9" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859160 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="521498fd-fd08-4dc4-ba76-3a92a99cc7a1" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859173 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ed5dab-4be9-4c17-934d-75ec8a900d7c" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859190 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1359b44b-6aa8-48f7-98e0-faea12df3d79" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859207 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e7ec61-ebc8-4a53-be29-9243e33b6ca7" containerName="mariadb-database-create" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859221 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="70d12fd2-0c0c-435c-863d-0b5445b67460" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859232 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859244 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8094ae6e-4447-47a9-8e27-8d59856f6891" containerName="ovn-config" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.859251 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1c8d94-4154-402a-858d-fd819787ba8e" containerName="mariadb-account-create-update" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.860223 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.871273 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.875870 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.951423 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " pod="openstack/mysqld-exporter-0" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.951593 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-config-data\") pod \"mysqld-exporter-0\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " pod="openstack/mysqld-exporter-0" Nov 29 14:50:34 crc kubenswrapper[4907]: I1129 14:50:34.951613 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44ltd\" (UniqueName: \"kubernetes.io/projected/27a0a09a-72e9-4d7f-94cf-fe1717484497-kube-api-access-44ltd\") pod \"mysqld-exporter-0\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " pod="openstack/mysqld-exporter-0" Nov 29 14:50:35 crc kubenswrapper[4907]: I1129 14:50:35.053414 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-config-data\") pod \"mysqld-exporter-0\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " pod="openstack/mysqld-exporter-0" Nov 29 14:50:35 crc kubenswrapper[4907]: I1129 14:50:35.053473 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44ltd\" (UniqueName: \"kubernetes.io/projected/27a0a09a-72e9-4d7f-94cf-fe1717484497-kube-api-access-44ltd\") pod \"mysqld-exporter-0\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " pod="openstack/mysqld-exporter-0" Nov 29 14:50:35 crc kubenswrapper[4907]: I1129 14:50:35.053518 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " pod="openstack/mysqld-exporter-0" Nov 29 14:50:35 crc kubenswrapper[4907]: I1129 14:50:35.062360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-config-data\") pod \"mysqld-exporter-0\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " pod="openstack/mysqld-exporter-0" Nov 29 14:50:35 crc kubenswrapper[4907]: I1129 14:50:35.070211 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " pod="openstack/mysqld-exporter-0" Nov 29 14:50:35 crc kubenswrapper[4907]: I1129 14:50:35.074170 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44ltd\" (UniqueName: \"kubernetes.io/projected/27a0a09a-72e9-4d7f-94cf-fe1717484497-kube-api-access-44ltd\") pod \"mysqld-exporter-0\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " pod="openstack/mysqld-exporter-0" Nov 29 14:50:35 crc kubenswrapper[4907]: I1129 14:50:35.204844 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Nov 29 14:50:38 crc kubenswrapper[4907]: I1129 14:50:38.306241 4907 generic.go:334] "Generic (PLEG): container finished" podID="b794f06f-38a0-4c4d-933b-db50f05ddfb8" containerID="b1dcf51e709874aab7f22e33ad835fb772240dbe871721a5be5fc8a3a9c5f57b" exitCode=0 Nov 29 14:50:38 crc kubenswrapper[4907]: I1129 14:50:38.306385 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b794f06f-38a0-4c4d-933b-db50f05ddfb8","Type":"ContainerDied","Data":"b1dcf51e709874aab7f22e33ad835fb772240dbe871721a5be5fc8a3a9c5f57b"} Nov 29 14:50:39 crc kubenswrapper[4907]: E1129 14:50:39.403242 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache]" Nov 29 14:50:39 crc kubenswrapper[4907]: E1129 14:50:39.446145 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache]" Nov 29 14:50:40 crc kubenswrapper[4907]: I1129 14:50:40.329464 4907 generic.go:334] "Generic (PLEG): container finished" podID="9b47a96b-53c6-4e4e-b92f-a0c12c5310b3" 
containerID="b944be8db7cc0750ea5e25b0a2d3ca806a386547d1944b7ffc6f2c350e953da2" exitCode=0 Nov 29 14:50:40 crc kubenswrapper[4907]: I1129 14:50:40.329511 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-prtgl" event={"ID":"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3","Type":"ContainerDied","Data":"b944be8db7cc0750ea5e25b0a2d3ca806a386547d1944b7ffc6f2c350e953da2"} Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.581296 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-prtgl" Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.723813 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-combined-ca-bundle\") pod \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.724132 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwxdm\" (UniqueName: \"kubernetes.io/projected/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-kube-api-access-rwxdm\") pod \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.724156 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-config-data\") pod \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\" (UID: \"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3\") " Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.731352 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-kube-api-access-rwxdm" (OuterVolumeSpecName: "kube-api-access-rwxdm") pod "9b47a96b-53c6-4e4e-b92f-a0c12c5310b3" (UID: 
"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3"). InnerVolumeSpecName "kube-api-access-rwxdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.779704 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b47a96b-53c6-4e4e-b92f-a0c12c5310b3" (UID: "9b47a96b-53c6-4e4e-b92f-a0c12c5310b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.788601 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-config-data" (OuterVolumeSpecName: "config-data") pod "9b47a96b-53c6-4e4e-b92f-a0c12c5310b3" (UID: "9b47a96b-53c6-4e4e-b92f-a0c12c5310b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.826919 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.826962 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwxdm\" (UniqueName: \"kubernetes.io/projected/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-kube-api-access-rwxdm\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.826988 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:42 crc kubenswrapper[4907]: W1129 14:50:42.904248 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27a0a09a_72e9_4d7f_94cf_fe1717484497.slice/crio-e09f09d6ae4eb2316d97d4fb4e2015887acaeca76f3ca6a4f35ecd62ec28f393 WatchSource:0}: Error finding container e09f09d6ae4eb2316d97d4fb4e2015887acaeca76f3ca6a4f35ecd62ec28f393: Status 404 returned error can't find the container with id e09f09d6ae4eb2316d97d4fb4e2015887acaeca76f3ca6a4f35ecd62ec28f393 Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.907087 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 14:50:42 crc kubenswrapper[4907]: I1129 14:50:42.908102 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.361890 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-prtgl" event={"ID":"9b47a96b-53c6-4e4e-b92f-a0c12c5310b3","Type":"ContainerDied","Data":"0a38d0e16a37427e6c605ba68e9a3606c133ae7397c4d61dafd25e12c11293cc"} Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.362222 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a38d0e16a37427e6c605ba68e9a3606c133ae7397c4d61dafd25e12c11293cc" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.363841 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-prtgl" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.387741 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"27a0a09a-72e9-4d7f-94cf-fe1717484497","Type":"ContainerStarted","Data":"e09f09d6ae4eb2316d97d4fb4e2015887acaeca76f3ca6a4f35ecd62ec28f393"} Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.394880 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b794f06f-38a0-4c4d-933b-db50f05ddfb8","Type":"ContainerStarted","Data":"a1d086b1857d5db93ecc52f0064eb68e7d1c33313a527987010f9791d629d63c"} Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.863412 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-2zbkj"] Nov 29 14:50:43 crc kubenswrapper[4907]: E1129 14:50:43.863907 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b47a96b-53c6-4e4e-b92f-a0c12c5310b3" containerName="keystone-db-sync" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.863920 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b47a96b-53c6-4e4e-b92f-a0c12c5310b3" containerName="keystone-db-sync" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.864099 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b47a96b-53c6-4e4e-b92f-a0c12c5310b3" containerName="keystone-db-sync" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.865110 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.873072 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-2zbkj"] Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.935057 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-d7255"] Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.936540 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.939763 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcrpv" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.939990 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.940104 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.940463 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.940495 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.951870 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d7255"] Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.952542 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 
14:50:43.952585 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.952617 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhxwt\" (UniqueName: \"kubernetes.io/projected/df83eea3-3e83-40bc-93e7-0342e4ffd99e-kube-api-access-hhxwt\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.952651 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-config\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:43 crc kubenswrapper[4907]: I1129 14:50:43.952732 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-dns-svc\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.029095 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-5vflp"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.030536 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.033281 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-f988f" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.033875 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.045904 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-5vflp"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.054955 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-combined-ca-bundle\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055090 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055110 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-scripts\") pod 
\"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055133 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-config-data\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055153 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhxwt\" (UniqueName: \"kubernetes.io/projected/df83eea3-3e83-40bc-93e7-0342e4ffd99e-kube-api-access-hhxwt\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055182 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-config\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-credential-keys\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055259 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-dns-svc\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: 
\"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055273 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-fernet-keys\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.055292 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2vvz\" (UniqueName: \"kubernetes.io/projected/82e47dac-2b81-437c-9cae-05715ba34615-kube-api-access-l2vvz\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.056041 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.060529 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-config\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.061168 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-dns-svc\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" 
Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.061342 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.097914 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-8lc4r"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.102245 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.108349 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fqcjq" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.108513 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.108642 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.119422 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhxwt\" (UniqueName: \"kubernetes.io/projected/df83eea3-3e83-40bc-93e7-0342e4ffd99e-kube-api-access-hhxwt\") pod \"dnsmasq-dns-f877ddd87-2zbkj\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.138508 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4r59x"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.141233 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.157929 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.158163 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rrjq4" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.160536 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-fernet-keys\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.161165 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2vvz\" (UniqueName: \"kubernetes.io/projected/82e47dac-2b81-437c-9cae-05715ba34615-kube-api-access-l2vvz\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.161204 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-combined-ca-bundle\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.161380 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-scripts\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.158324 4907 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.171456 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ltf5\" (UniqueName: \"kubernetes.io/projected/1e3e611f-5e4c-4b2c-baea-5f74745f315b-kube-api-access-9ltf5\") pod \"heat-db-sync-5vflp\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.171512 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-config-data\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.171627 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-credential-keys\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.171739 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-config-data\") pod \"heat-db-sync-5vflp\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.171754 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-combined-ca-bundle\") pod \"heat-db-sync-5vflp\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " 
pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.181121 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-scripts\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.183816 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8lc4r"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.184136 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.194326 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-config-data\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.194579 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-fernet-keys\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.195829 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2vvz\" (UniqueName: \"kubernetes.io/projected/82e47dac-2b81-437c-9cae-05715ba34615-kube-api-access-l2vvz\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.195967 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-credential-keys\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.197792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-combined-ca-bundle\") pod \"keystone-bootstrap-d7255\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.218233 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4r59x"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275263 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-config\") pod \"neutron-db-sync-4r59x\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275347 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-db-sync-config-data\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275403 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx5k8\" (UniqueName: \"kubernetes.io/projected/a9d771c9-5533-4167-adeb-f77c429ded79-kube-api-access-zx5k8\") pod \"neutron-db-sync-4r59x\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 
14:50:44.275423 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-scripts\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275454 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-config-data\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275485 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ltf5\" (UniqueName: \"kubernetes.io/projected/1e3e611f-5e4c-4b2c-baea-5f74745f315b-kube-api-access-9ltf5\") pod \"heat-db-sync-5vflp\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrdk\" (UniqueName: \"kubernetes.io/projected/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-kube-api-access-jvrdk\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275561 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-etc-machine-id\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275585 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-config-data\") pod \"heat-db-sync-5vflp\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275602 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-combined-ca-bundle\") pod \"heat-db-sync-5vflp\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275743 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-combined-ca-bundle\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.275836 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-combined-ca-bundle\") pod \"neutron-db-sync-4r59x\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.280519 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-config-data\") pod \"heat-db-sync-5vflp\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.286031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-combined-ca-bundle\") pod \"heat-db-sync-5vflp\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.290291 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-2zbkj"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.291384 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d7255" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.310558 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-6csw2"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.312044 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.317313 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.317576 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fm8g9" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.317680 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.330630 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6csw2"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.333158 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ltf5\" (UniqueName: \"kubernetes.io/projected/1e3e611f-5e4c-4b2c-baea-5f74745f315b-kube-api-access-9ltf5\") pod \"heat-db-sync-5vflp\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.343305 4907 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-tg5km"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.352864 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.354408 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-5vflp" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.356804 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-tg5km"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.369348 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-92jsx"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.373615 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.380742 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-combined-ca-bundle\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.380791 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-combined-ca-bundle\") pod \"neutron-db-sync-4r59x\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.380833 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-config\") pod \"neutron-db-sync-4r59x\" (UID: 
\"a9d771c9-5533-4167-adeb-f77c429ded79\") " pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.380881 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-db-sync-config-data\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.380912 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx5k8\" (UniqueName: \"kubernetes.io/projected/a9d771c9-5533-4167-adeb-f77c429ded79-kube-api-access-zx5k8\") pod \"neutron-db-sync-4r59x\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.380930 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-scripts\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.380948 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-config-data\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.381003 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrdk\" (UniqueName: \"kubernetes.io/projected/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-kube-api-access-jvrdk\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: 
I1129 14:50:44.381052 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-etc-machine-id\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.381130 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-etc-machine-id\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.390215 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.390487 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xn9gj" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.393219 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-combined-ca-bundle\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.395425 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-db-sync-config-data\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.396109 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-combined-ca-bundle\") pod \"neutron-db-sync-4r59x\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.397205 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-92jsx"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.398811 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-scripts\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.399068 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-config-data\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.403138 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx5k8\" (UniqueName: \"kubernetes.io/projected/a9d771c9-5533-4167-adeb-f77c429ded79-kube-api-access-zx5k8\") pod \"neutron-db-sync-4r59x\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.416169 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-config\") pod \"neutron-db-sync-4r59x\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.429565 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrdk\" (UniqueName: 
\"kubernetes.io/projected/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-kube-api-access-jvrdk\") pod \"cinder-db-sync-8lc4r\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.435633 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pwxst" event={"ID":"8d940ef0-0877-471d-906a-b6235392867d","Type":"ContainerStarted","Data":"99c9a12314c0a1e32c91651ac01ece45aec45412ee520201491f13fb17cc2ac2"} Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.483415 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pwxst" podStartSLOduration=2.752891946 podStartE2EDuration="23.483396516s" podCreationTimestamp="2025-11-29 14:50:21 +0000 UTC" firstStartedPulling="2025-11-29 14:50:22.50328603 +0000 UTC m=+1320.490123682" lastFinishedPulling="2025-11-29 14:50:43.23379056 +0000 UTC m=+1341.220628252" observedRunningTime="2025-11-29 14:50:44.47759885 +0000 UTC m=+1342.464436502" watchObservedRunningTime="2025-11-29 14:50:44.483396516 +0000 UTC m=+1342.470234168" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485025 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-scripts\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485072 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-config-data\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485126 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7gfb\" (UniqueName: \"kubernetes.io/projected/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-kube-api-access-g7gfb\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485166 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-config\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485339 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvrnk\" (UniqueName: \"kubernetes.io/projected/0089fab2-d07c-4dad-bce1-a4c085a35d24-kube-api-access-lvrnk\") pod \"barbican-db-sync-92jsx\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") " pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485384 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485419 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de06d815-0165-4c7a-aeed-fda3a647ba27-logs\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485508 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485566 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485618 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-combined-ca-bundle\") pod \"barbican-db-sync-92jsx\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") " pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485678 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qw7t\" (UniqueName: \"kubernetes.io/projected/de06d815-0165-4c7a-aeed-fda3a647ba27-kube-api-access-4qw7t\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.485760 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-db-sync-config-data\") pod \"barbican-db-sync-92jsx\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") " pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 
14:50:44.485856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-combined-ca-bundle\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.587800 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.587887 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.587929 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-combined-ca-bundle\") pod \"barbican-db-sync-92jsx\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") " pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.587967 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qw7t\" (UniqueName: \"kubernetes.io/projected/de06d815-0165-4c7a-aeed-fda3a647ba27-kube-api-access-4qw7t\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.588024 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-db-sync-config-data\") pod \"barbican-db-sync-92jsx\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") " pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.588099 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-combined-ca-bundle\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.588125 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-config-data\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.588148 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-scripts\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.588206 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7gfb\" (UniqueName: \"kubernetes.io/projected/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-kube-api-access-g7gfb\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.588237 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-config\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.588364 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvrnk\" (UniqueName: \"kubernetes.io/projected/0089fab2-d07c-4dad-bce1-a4c085a35d24-kube-api-access-lvrnk\") pod \"barbican-db-sync-92jsx\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") " pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.588407 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.588453 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de06d815-0165-4c7a-aeed-fda3a647ba27-logs\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.590960 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.592157 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de06d815-0165-4c7a-aeed-fda3a647ba27-logs\") pod \"placement-db-sync-6csw2\" (UID: 
\"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.592372 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.592998 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.596608 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-config-data\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.598586 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-combined-ca-bundle\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.599169 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-combined-ca-bundle\") pod \"barbican-db-sync-92jsx\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") " pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: 
I1129 14:50:44.599260 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-config\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.603557 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-scripts\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.610009 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvrnk\" (UniqueName: \"kubernetes.io/projected/0089fab2-d07c-4dad-bce1-a4c085a35d24-kube-api-access-lvrnk\") pod \"barbican-db-sync-92jsx\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") " pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.617883 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-db-sync-config-data\") pod \"barbican-db-sync-92jsx\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") " pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.618848 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qw7t\" (UniqueName: \"kubernetes.io/projected/de06d815-0165-4c7a-aeed-fda3a647ba27-kube-api-access-4qw7t\") pod \"placement-db-sync-6csw2\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") " pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.622718 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7gfb\" (UniqueName: 
\"kubernetes.io/projected/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-kube-api-access-g7gfb\") pod \"dnsmasq-dns-68dcc9cf6f-tg5km\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.640955 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.663046 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4r59x" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.771618 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.773949 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.776529 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.787033 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.808913 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6csw2" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.811608 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.822914 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.870961 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-92jsx" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.912693 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-run-httpd\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.912744 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-log-httpd\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.912794 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-scripts\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.912837 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.912867 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.912895 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-config-data\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:44 crc kubenswrapper[4907]: I1129 14:50:44.912952 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7vb4\" (UniqueName: \"kubernetes.io/projected/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-kube-api-access-q7vb4\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.014874 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.014915 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.014945 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-config-data\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.015015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7vb4\" (UniqueName: 
\"kubernetes.io/projected/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-kube-api-access-q7vb4\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.015078 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-run-httpd\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.015100 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-log-httpd\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.015800 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-log-httpd\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.015973 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-run-httpd\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.017504 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-scripts\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.019667 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.020198 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.020726 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-scripts\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.022985 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-config-data\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.035807 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7vb4\" (UniqueName: \"kubernetes.io/projected/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-kube-api-access-q7vb4\") pod \"ceilometer-0\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " pod="openstack/ceilometer-0" Nov 29 14:50:45 crc kubenswrapper[4907]: I1129 14:50:45.091484 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:50:46 crc kubenswrapper[4907]: I1129 14:50:46.559743 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b794f06f-38a0-4c4d-933b-db50f05ddfb8","Type":"ContainerStarted","Data":"2d64196d5831d2d908ba9803f731d1c856fb71abedf94a965e8fccebdf30c76f"} Nov 29 14:50:46 crc kubenswrapper[4907]: I1129 14:50:46.625214 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-5vflp"] Nov 29 14:50:46 crc kubenswrapper[4907]: I1129 14:50:46.683473 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:50:47 crc kubenswrapper[4907]: W1129 14:50:47.124505 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e527c4_b1c6_4f80_a37c_a6fe6ab8c5de.slice/crio-b78c3106b9c871c610e3f0abdd66d771187193ceaba843801c1d35e0cac031e2 WatchSource:0}: Error finding container b78c3106b9c871c610e3f0abdd66d771187193ceaba843801c1d35e0cac031e2: Status 404 returned error can't find the container with id b78c3106b9c871c610e3f0abdd66d771187193ceaba843801c1d35e0cac031e2 Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.144594 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8lc4r"] Nov 29 14:50:47 crc kubenswrapper[4907]: W1129 14:50:47.173588 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c3295c_d537_4302_80c1_ce39f0f4fcb4.slice/crio-499711dd8cfbdb0f523b9fc9312dece2df6849116f91146861cc3a366233d358 WatchSource:0}: Error finding container 499711dd8cfbdb0f523b9fc9312dece2df6849116f91146861cc3a366233d358: Status 404 returned error can't find the container with id 499711dd8cfbdb0f523b9fc9312dece2df6849116f91146861cc3a366233d358 Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.180519 4907 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-92jsx"] Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.190497 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-tg5km"] Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.209333 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.244727 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4r59x"] Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.509789 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-2zbkj"] Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.520261 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d7255"] Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.561017 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6csw2"] Nov 29 14:50:47 crc kubenswrapper[4907]: W1129 14:50:47.598503 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde06d815_0165_4c7a_aeed_fda3a647ba27.slice/crio-b03306e63e50ef4f821ce689af421c66c0187f5f9914317856308aff8f1efc96 WatchSource:0}: Error finding container b03306e63e50ef4f821ce689af421c66c0187f5f9914317856308aff8f1efc96: Status 404 returned error can't find the container with id b03306e63e50ef4f821ce689af421c66c0187f5f9914317856308aff8f1efc96 Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.672102 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b794f06f-38a0-4c4d-933b-db50f05ddfb8","Type":"ContainerStarted","Data":"acdb3e7fc11c2a3c229ccf7dbd3c62129a79f767ea7f1c8ea9e817ba3dcd8d1e"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.692827 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-d7255" event={"ID":"82e47dac-2b81-437c-9cae-05715ba34615","Type":"ContainerStarted","Data":"b99e821f18413a606f9207bfe6780ea7bb0c00fc8d2da122bc4b93a3ba227053"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.697402 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" event={"ID":"df83eea3-3e83-40bc-93e7-0342e4ffd99e","Type":"ContainerStarted","Data":"65d4316da4e537459a6f0224901eee0fdd38103204a7d63112184552fc2cf3f3"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.699880 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-92jsx" event={"ID":"0089fab2-d07c-4dad-bce1-a4c085a35d24","Type":"ContainerStarted","Data":"6e1bdc255e22cd5ad1e335ebf3516d3911678d6bfb24a607c302be37f9fac3db"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.701316 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"27a0a09a-72e9-4d7f-94cf-fe1717484497","Type":"ContainerStarted","Data":"de4679883cddae5aaa50a227b5de2694df612001c9159facbbd6ba65cf25694b"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.702148 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5vflp" event={"ID":"1e3e611f-5e4c-4b2c-baea-5f74745f315b","Type":"ContainerStarted","Data":"07f770c7fe3a0c5269fb3c051e3bf155278a48c4bf21f27bc8132761ab769df0"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.703617 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"a95b71133252ce0183f0040e5200773131cc78a4454ce214163c831103557c96"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.704540 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8lc4r" 
event={"ID":"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de","Type":"ContainerStarted","Data":"b78c3106b9c871c610e3f0abdd66d771187193ceaba843801c1d35e0cac031e2"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.705381 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4r59x" event={"ID":"a9d771c9-5533-4167-adeb-f77c429ded79","Type":"ContainerStarted","Data":"0e6c94f2b855540ddfda8a8d6971066a278f3d69e3ef8e3619621f126bed0f8f"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.706793 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" event={"ID":"aeb5490b-3d2d-4527-91f5-b6a93c7fed08","Type":"ContainerStarted","Data":"a106873ef5938b3b19fd9d89020d5b47c2dc5fd43ac79982ecc7ea3ed9961f28"} Nov 29 14:50:47 crc kubenswrapper[4907]: I1129 14:50:47.710664 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7c3295c-d537-4302-80c1-ce39f0f4fcb4","Type":"ContainerStarted","Data":"499711dd8cfbdb0f523b9fc9312dece2df6849116f91146861cc3a366233d358"} Nov 29 14:50:48 crc kubenswrapper[4907]: E1129 14:50:48.261968 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache]" Nov 29 14:50:48 crc kubenswrapper[4907]: E1129 14:50:48.263997 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache]" Nov 29 14:50:48 crc kubenswrapper[4907]: I1129 14:50:48.721107 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6csw2" event={"ID":"de06d815-0165-4c7a-aeed-fda3a647ba27","Type":"ContainerStarted","Data":"b03306e63e50ef4f821ce689af421c66c0187f5f9914317856308aff8f1efc96"} Nov 29 14:50:49 crc kubenswrapper[4907]: E1129 14:50:49.499527 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache]" Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.758468 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" event={"ID":"df83eea3-3e83-40bc-93e7-0342e4ffd99e","Type":"ContainerStarted","Data":"9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22"} Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.758589 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" podUID="df83eea3-3e83-40bc-93e7-0342e4ffd99e" containerName="init" containerID="cri-o://9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22" gracePeriod=10 Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.766292 4907 generic.go:334] "Generic (PLEG): container finished" podID="aeb5490b-3d2d-4527-91f5-b6a93c7fed08" 
containerID="7965bddea3532b07e619eb7e7edd180e8e0b0b8c3f238bf0a9032a01221da145" exitCode=0 Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.766420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" event={"ID":"aeb5490b-3d2d-4527-91f5-b6a93c7fed08","Type":"ContainerDied","Data":"7965bddea3532b07e619eb7e7edd180e8e0b0b8c3f238bf0a9032a01221da145"} Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.770969 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"58dfc564c2f11b54d664d3eaa33fbad2c00af2f3006fbe51cbcf36c2d2af6ec3"} Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.784325 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d7255" event={"ID":"82e47dac-2b81-437c-9cae-05715ba34615","Type":"ContainerStarted","Data":"ebf036c804778e50de4af4602abd5562148a2b93f8da8c6dd577baf9978c623b"} Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.787366 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4r59x" event={"ID":"a9d771c9-5533-4167-adeb-f77c429ded79","Type":"ContainerStarted","Data":"47ca5682e6e0095166625dcc959b72109edbfc7c8cea1897f03314bc87ac9bf9"} Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.869402 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4r59x" podStartSLOduration=5.869382251 podStartE2EDuration="5.869382251s" podCreationTimestamp="2025-11-29 14:50:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:49.821304541 +0000 UTC m=+1347.808142213" watchObservedRunningTime="2025-11-29 14:50:49.869382251 +0000 UTC m=+1347.856219903" Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.889259 4907 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/keystone-bootstrap-d7255" podStartSLOduration=6.889235487 podStartE2EDuration="6.889235487s" podCreationTimestamp="2025-11-29 14:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:49.844283846 +0000 UTC m=+1347.831121498" watchObservedRunningTime="2025-11-29 14:50:49.889235487 +0000 UTC m=+1347.876073149" Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.898310 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=12.666847576 podStartE2EDuration="15.898282955s" podCreationTimestamp="2025-11-29 14:50:34 +0000 UTC" firstStartedPulling="2025-11-29 14:50:42.906458651 +0000 UTC m=+1340.893296303" lastFinishedPulling="2025-11-29 14:50:46.13789401 +0000 UTC m=+1344.124731682" observedRunningTime="2025-11-29 14:50:49.864290246 +0000 UTC m=+1347.851127898" watchObservedRunningTime="2025-11-29 14:50:49.898282955 +0000 UTC m=+1347.885120607" Nov 29 14:50:49 crc kubenswrapper[4907]: I1129 14:50:49.907419 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=24.907399895 podStartE2EDuration="24.907399895s" podCreationTimestamp="2025-11-29 14:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:49.897960236 +0000 UTC m=+1347.884797888" watchObservedRunningTime="2025-11-29 14:50:49.907399895 +0000 UTC m=+1347.894237547" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.543682 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.653548 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.681252 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhxwt\" (UniqueName: \"kubernetes.io/projected/df83eea3-3e83-40bc-93e7-0342e4ffd99e-kube-api-access-hhxwt\") pod \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.681309 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-sb\") pod \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.681429 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-dns-svc\") pod \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.681752 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-nb\") pod \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.681803 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-config\") pod \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " 
Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.690675 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df83eea3-3e83-40bc-93e7-0342e4ffd99e-kube-api-access-hhxwt" (OuterVolumeSpecName: "kube-api-access-hhxwt") pod "df83eea3-3e83-40bc-93e7-0342e4ffd99e" (UID: "df83eea3-3e83-40bc-93e7-0342e4ffd99e"). InnerVolumeSpecName "kube-api-access-hhxwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.719188 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df83eea3-3e83-40bc-93e7-0342e4ffd99e" (UID: "df83eea3-3e83-40bc-93e7-0342e4ffd99e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.743356 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df83eea3-3e83-40bc-93e7-0342e4ffd99e" (UID: "df83eea3-3e83-40bc-93e7-0342e4ffd99e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:50 crc kubenswrapper[4907]: E1129 14:50:50.764496 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-dns-svc podName:df83eea3-3e83-40bc-93e7-0342e4ffd99e nodeName:}" failed. No retries permitted until 2025-11-29 14:50:51.264472232 +0000 UTC m=+1349.251309884 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-dns-svc") pod "df83eea3-3e83-40bc-93e7-0342e4ffd99e" (UID: "df83eea3-3e83-40bc-93e7-0342e4ffd99e") : error deleting /var/lib/kubelet/pods/df83eea3-3e83-40bc-93e7-0342e4ffd99e/volume-subpaths: remove /var/lib/kubelet/pods/df83eea3-3e83-40bc-93e7-0342e4ffd99e/volume-subpaths: no such file or directory Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.764760 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-config" (OuterVolumeSpecName: "config") pod "df83eea3-3e83-40bc-93e7-0342e4ffd99e" (UID: "df83eea3-3e83-40bc-93e7-0342e4ffd99e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.785187 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.785216 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhxwt\" (UniqueName: \"kubernetes.io/projected/df83eea3-3e83-40bc-93e7-0342e4ffd99e-kube-api-access-hhxwt\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.785226 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.785234 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.868779 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" event={"ID":"aeb5490b-3d2d-4527-91f5-b6a93c7fed08","Type":"ContainerStarted","Data":"526e56fe0320227fd4345fc96fa4dc53890c9e5c0791bb167e38b075ab06fbb1"} Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.870147 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.877121 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"6b6f7fdbe087ff8446f66fa00b8c87123f3e3ff16c7fdc87876baf423da75c20"} Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.877172 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"b5539f245236ec33d4d3f923223fa927c33b82d84a5dc62f8fc739ca15e9e397"} Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.888616 4907 generic.go:334] "Generic (PLEG): container finished" podID="df83eea3-3e83-40bc-93e7-0342e4ffd99e" containerID="9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22" exitCode=0 Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.888977 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.889327 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" event={"ID":"df83eea3-3e83-40bc-93e7-0342e4ffd99e","Type":"ContainerDied","Data":"9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22"} Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.889358 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-2zbkj" event={"ID":"df83eea3-3e83-40bc-93e7-0342e4ffd99e","Type":"ContainerDied","Data":"65d4316da4e537459a6f0224901eee0fdd38103204a7d63112184552fc2cf3f3"} Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.889372 4907 scope.go:117] "RemoveContainer" containerID="9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.903670 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" podStartSLOduration=6.903645009 podStartE2EDuration="6.903645009s" podCreationTimestamp="2025-11-29 14:50:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:50:50.885953885 +0000 UTC m=+1348.872791557" watchObservedRunningTime="2025-11-29 14:50:50.903645009 +0000 UTC m=+1348.890482661" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.945087 4907 scope.go:117] "RemoveContainer" containerID="9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22" Nov 29 14:50:50 crc kubenswrapper[4907]: E1129 14:50:50.945686 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22\": container with ID starting with 9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22 not found: ID does not 
exist" containerID="9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22" Nov 29 14:50:50 crc kubenswrapper[4907]: I1129 14:50:50.945716 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22"} err="failed to get container status \"9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22\": rpc error: code = NotFound desc = could not find container \"9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22\": container with ID starting with 9be3cba3657d965e451a84c4a4c685107cc258b8e45f7bbf76a34969a4626c22 not found: ID does not exist" Nov 29 14:50:51 crc kubenswrapper[4907]: I1129 14:50:51.307041 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-dns-svc\") pod \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\" (UID: \"df83eea3-3e83-40bc-93e7-0342e4ffd99e\") " Nov 29 14:50:51 crc kubenswrapper[4907]: I1129 14:50:51.307582 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df83eea3-3e83-40bc-93e7-0342e4ffd99e" (UID: "df83eea3-3e83-40bc-93e7-0342e4ffd99e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:50:51 crc kubenswrapper[4907]: I1129 14:50:51.308479 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df83eea3-3e83-40bc-93e7-0342e4ffd99e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:50:51 crc kubenswrapper[4907]: I1129 14:50:51.625964 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-2zbkj"] Nov 29 14:50:51 crc kubenswrapper[4907]: I1129 14:50:51.646122 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-2zbkj"] Nov 29 14:50:52 crc kubenswrapper[4907]: I1129 14:50:52.494568 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df83eea3-3e83-40bc-93e7-0342e4ffd99e" path="/var/lib/kubelet/pods/df83eea3-3e83-40bc-93e7-0342e4ffd99e/volumes" Nov 29 14:50:54 crc kubenswrapper[4907]: E1129 14:50:54.463430 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache]" Nov 29 14:50:54 crc kubenswrapper[4907]: I1129 14:50:54.938788 4907 generic.go:334] "Generic (PLEG): container finished" podID="82e47dac-2b81-437c-9cae-05715ba34615" containerID="ebf036c804778e50de4af4602abd5562148a2b93f8da8c6dd577baf9978c623b" exitCode=0 Nov 29 14:50:54 crc kubenswrapper[4907]: I1129 14:50:54.938904 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d7255" 
event={"ID":"82e47dac-2b81-437c-9cae-05715ba34615","Type":"ContainerDied","Data":"ebf036c804778e50de4af4602abd5562148a2b93f8da8c6dd577baf9978c623b"} Nov 29 14:50:55 crc kubenswrapper[4907]: I1129 14:50:55.653997 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:55 crc kubenswrapper[4907]: I1129 14:50:55.660028 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:55 crc kubenswrapper[4907]: I1129 14:50:55.965152 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 29 14:50:58 crc kubenswrapper[4907]: I1129 14:50:58.490469 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:50:58 crc kubenswrapper[4907]: I1129 14:50:58.491069 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:50:59 crc kubenswrapper[4907]: E1129 14:50:59.808076 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in 
memory cache]" Nov 29 14:50:59 crc kubenswrapper[4907]: I1129 14:50:59.833237 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" Nov 29 14:50:59 crc kubenswrapper[4907]: I1129 14:50:59.910746 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7w4m2"] Nov 29 14:50:59 crc kubenswrapper[4907]: I1129 14:50:59.910972 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-7w4m2" podUID="13de3308-18f0-431d-997a-9288da0f520a" containerName="dnsmasq-dns" containerID="cri-o://e957cbcc59c7dceca9e4424e4cb90793f779bc096698fbcc57511e06c8b7ffc1" gracePeriod=10 Nov 29 14:51:01 crc kubenswrapper[4907]: I1129 14:51:01.019798 4907 generic.go:334] "Generic (PLEG): container finished" podID="13de3308-18f0-431d-997a-9288da0f520a" containerID="e957cbcc59c7dceca9e4424e4cb90793f779bc096698fbcc57511e06c8b7ffc1" exitCode=0 Nov 29 14:51:01 crc kubenswrapper[4907]: I1129 14:51:01.019879 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7w4m2" event={"ID":"13de3308-18f0-431d-997a-9288da0f520a","Type":"ContainerDied","Data":"e957cbcc59c7dceca9e4424e4cb90793f779bc096698fbcc57511e06c8b7ffc1"} Nov 29 14:51:03 crc kubenswrapper[4907]: I1129 14:51:03.045400 4907 generic.go:334] "Generic (PLEG): container finished" podID="8d940ef0-0877-471d-906a-b6235392867d" containerID="99c9a12314c0a1e32c91651ac01ece45aec45412ee520201491f13fb17cc2ac2" exitCode=0 Nov 29 14:51:03 crc kubenswrapper[4907]: I1129 14:51:03.045483 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pwxst" event={"ID":"8d940ef0-0877-471d-906a-b6235392867d","Type":"ContainerDied","Data":"99c9a12314c0a1e32c91651ac01ece45aec45412ee520201491f13fb17cc2ac2"} Nov 29 14:51:03 crc kubenswrapper[4907]: I1129 14:51:03.626150 4907 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-698758b865-7w4m2" podUID="13de3308-18f0-431d-997a-9288da0f520a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Nov 29 14:51:07 crc kubenswrapper[4907]: E1129 14:51:07.593001 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Nov 29 14:51:07 crc kubenswrapper[4907]: E1129 14:51:07.600695 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5ch686hdch657h79h57hb8h67bh644h58ch56hf6h5d4h59h54bh676h74h586h598h95h5dchdh5b4h567h564h5f5h5b8h5b6hfh68bh5f8h64q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7vb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,Moun
tPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(f7c3295c-d537-4302-80c1-ce39f0f4fcb4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.695814 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-d7255" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.807512 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-credential-keys\") pod \"82e47dac-2b81-437c-9cae-05715ba34615\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.807659 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2vvz\" (UniqueName: \"kubernetes.io/projected/82e47dac-2b81-437c-9cae-05715ba34615-kube-api-access-l2vvz\") pod \"82e47dac-2b81-437c-9cae-05715ba34615\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.807742 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-combined-ca-bundle\") pod \"82e47dac-2b81-437c-9cae-05715ba34615\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.807839 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-config-data\") pod \"82e47dac-2b81-437c-9cae-05715ba34615\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.807918 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-fernet-keys\") pod \"82e47dac-2b81-437c-9cae-05715ba34615\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.807981 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-scripts\") pod \"82e47dac-2b81-437c-9cae-05715ba34615\" (UID: \"82e47dac-2b81-437c-9cae-05715ba34615\") " Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.830667 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "82e47dac-2b81-437c-9cae-05715ba34615" (UID: "82e47dac-2b81-437c-9cae-05715ba34615"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.836392 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "82e47dac-2b81-437c-9cae-05715ba34615" (UID: "82e47dac-2b81-437c-9cae-05715ba34615"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.836670 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e47dac-2b81-437c-9cae-05715ba34615-kube-api-access-l2vvz" (OuterVolumeSpecName: "kube-api-access-l2vvz") pod "82e47dac-2b81-437c-9cae-05715ba34615" (UID: "82e47dac-2b81-437c-9cae-05715ba34615"). InnerVolumeSpecName "kube-api-access-l2vvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.876217 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-scripts" (OuterVolumeSpecName: "scripts") pod "82e47dac-2b81-437c-9cae-05715ba34615" (UID: "82e47dac-2b81-437c-9cae-05715ba34615"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.888340 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-config-data" (OuterVolumeSpecName: "config-data") pod "82e47dac-2b81-437c-9cae-05715ba34615" (UID: "82e47dac-2b81-437c-9cae-05715ba34615"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.892908 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82e47dac-2b81-437c-9cae-05715ba34615" (UID: "82e47dac-2b81-437c-9cae-05715ba34615"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.910384 4907 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.910430 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.910539 4907 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.910557 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2vvz\" (UniqueName: \"kubernetes.io/projected/82e47dac-2b81-437c-9cae-05715ba34615-kube-api-access-l2vvz\") on node \"crc\" DevicePath \"\"" Nov 29 
14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.910571 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:07 crc kubenswrapper[4907]: I1129 14:51:07.910581 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e47dac-2b81-437c-9cae-05715ba34615-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.138710 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d7255" event={"ID":"82e47dac-2b81-437c-9cae-05715ba34615","Type":"ContainerDied","Data":"b99e821f18413a606f9207bfe6780ea7bb0c00fc8d2da122bc4b93a3ba227053"} Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.138753 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b99e821f18413a606f9207bfe6780ea7bb0c00fc8d2da122bc4b93a3ba227053" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.138817 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-d7255" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.626278 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-7w4m2" podUID="13de3308-18f0-431d-997a-9288da0f520a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.790778 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-d7255"] Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.809326 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-d7255"] Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.896394 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-chkqj"] Nov 29 14:51:08 crc kubenswrapper[4907]: E1129 14:51:08.897561 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e47dac-2b81-437c-9cae-05715ba34615" containerName="keystone-bootstrap" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.897713 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e47dac-2b81-437c-9cae-05715ba34615" containerName="keystone-bootstrap" Nov 29 14:51:08 crc kubenswrapper[4907]: E1129 14:51:08.897912 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df83eea3-3e83-40bc-93e7-0342e4ffd99e" containerName="init" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.898038 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="df83eea3-3e83-40bc-93e7-0342e4ffd99e" containerName="init" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.898608 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e47dac-2b81-437c-9cae-05715ba34615" containerName="keystone-bootstrap" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.898778 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="df83eea3-3e83-40bc-93e7-0342e4ffd99e" containerName="init" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.900268 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.903252 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.903500 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.903749 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcrpv" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.905021 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.909886 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-chkqj"] Nov 29 14:51:08 crc kubenswrapper[4907]: I1129 14:51:08.915809 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.031010 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-scripts\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.031116 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-fernet-keys\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 
14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.031226 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-config-data\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.031260 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5rb\" (UniqueName: \"kubernetes.io/projected/a942b3b6-791f-4d18-abdd-113f3372158b-kube-api-access-df5rb\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.031318 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-combined-ca-bundle\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.031380 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-credential-keys\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.133230 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-config-data\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc 
kubenswrapper[4907]: I1129 14:51:09.133293 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5rb\" (UniqueName: \"kubernetes.io/projected/a942b3b6-791f-4d18-abdd-113f3372158b-kube-api-access-df5rb\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.133356 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-combined-ca-bundle\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.133422 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-credential-keys\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.133528 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-scripts\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.133587 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-fernet-keys\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.139682 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-credential-keys\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.140107 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-scripts\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.139767 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-fernet-keys\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.144909 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-config-data\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.151031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-combined-ca-bundle\") pod \"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.160253 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5rb\" (UniqueName: \"kubernetes.io/projected/a942b3b6-791f-4d18-abdd-113f3372158b-kube-api-access-df5rb\") pod 
\"keystone-bootstrap-chkqj\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") " pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: I1129 14:51:09.225954 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-chkqj" Nov 29 14:51:09 crc kubenswrapper[4907]: E1129 14:51:09.381096 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache]" Nov 29 14:51:09 crc kubenswrapper[4907]: E1129 14:51:09.855726 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache]" Nov 29 14:51:10 crc kubenswrapper[4907]: I1129 14:51:10.494732 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82e47dac-2b81-437c-9cae-05715ba34615" path="/var/lib/kubelet/pods/82e47dac-2b81-437c-9cae-05715ba34615/volumes" Nov 29 14:51:13 crc kubenswrapper[4907]: I1129 14:51:13.626932 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-7w4m2" podUID="13de3308-18f0-431d-997a-9288da0f520a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: 
connect: connection refused" Nov 29 14:51:13 crc kubenswrapper[4907]: I1129 14:51:13.627377 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:51:15 crc kubenswrapper[4907]: I1129 14:51:15.242537 4907 generic.go:334] "Generic (PLEG): container finished" podID="a9d771c9-5533-4167-adeb-f77c429ded79" containerID="47ca5682e6e0095166625dcc959b72109edbfc7c8cea1897f03314bc87ac9bf9" exitCode=0 Nov 29 14:51:15 crc kubenswrapper[4907]: I1129 14:51:15.242627 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4r59x" event={"ID":"a9d771c9-5533-4167-adeb-f77c429ded79","Type":"ContainerDied","Data":"47ca5682e6e0095166625dcc959b72109edbfc7c8cea1897f03314bc87ac9bf9"} Nov 29 14:51:17 crc kubenswrapper[4907]: E1129 14:51:17.756626 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 29 14:51:17 crc kubenswrapper[4907]: E1129 14:51:17.758372 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvrdk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-8lc4r_openstack(d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:51:17 crc kubenswrapper[4907]: E1129 14:51:17.759860 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-8lc4r" podUID="d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" Nov 29 14:51:17 crc kubenswrapper[4907]: I1129 14:51:17.832769 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pwxst" Nov 29 14:51:17 crc kubenswrapper[4907]: I1129 14:51:17.961394 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-config-data\") pod \"8d940ef0-0877-471d-906a-b6235392867d\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " Nov 29 14:51:17 crc kubenswrapper[4907]: I1129 14:51:17.961574 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rntmt\" (UniqueName: \"kubernetes.io/projected/8d940ef0-0877-471d-906a-b6235392867d-kube-api-access-rntmt\") pod \"8d940ef0-0877-471d-906a-b6235392867d\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " Nov 29 14:51:17 crc kubenswrapper[4907]: I1129 14:51:17.962656 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-combined-ca-bundle\") pod \"8d940ef0-0877-471d-906a-b6235392867d\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " Nov 
29 14:51:17 crc kubenswrapper[4907]: I1129 14:51:17.962731 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-db-sync-config-data\") pod \"8d940ef0-0877-471d-906a-b6235392867d\" (UID: \"8d940ef0-0877-471d-906a-b6235392867d\") " Nov 29 14:51:17 crc kubenswrapper[4907]: I1129 14:51:17.970731 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d940ef0-0877-471d-906a-b6235392867d-kube-api-access-rntmt" (OuterVolumeSpecName: "kube-api-access-rntmt") pod "8d940ef0-0877-471d-906a-b6235392867d" (UID: "8d940ef0-0877-471d-906a-b6235392867d"). InnerVolumeSpecName "kube-api-access-rntmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:51:17 crc kubenswrapper[4907]: I1129 14:51:17.982692 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8d940ef0-0877-471d-906a-b6235392867d" (UID: "8d940ef0-0877-471d-906a-b6235392867d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.014435 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d940ef0-0877-471d-906a-b6235392867d" (UID: "8d940ef0-0877-471d-906a-b6235392867d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.052694 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-config-data" (OuterVolumeSpecName: "config-data") pod "8d940ef0-0877-471d-906a-b6235392867d" (UID: "8d940ef0-0877-471d-906a-b6235392867d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.065592 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.065634 4907 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.065648 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d940ef0-0877-471d-906a-b6235392867d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.065660 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rntmt\" (UniqueName: \"kubernetes.io/projected/8d940ef0-0877-471d-906a-b6235392867d-kube-api-access-rntmt\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: E1129 14:51:18.240246 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Nov 29 14:51:18 crc kubenswrapper[4907]: E1129 14:51:18.240392 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9ltf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod heat-db-sync-5vflp_openstack(1e3e611f-5e4c-4b2c-baea-5f74745f315b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:51:18 crc kubenswrapper[4907]: E1129 14:51:18.241564 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-5vflp" podUID="1e3e611f-5e4c-4b2c-baea-5f74745f315b" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.244625 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4r59x" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.261981 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.271770 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lhb9\" (UniqueName: \"kubernetes.io/projected/13de3308-18f0-431d-997a-9288da0f520a-kube-api-access-5lhb9\") pod \"13de3308-18f0-431d-997a-9288da0f520a\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.271855 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-nb\") pod \"13de3308-18f0-431d-997a-9288da0f520a\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.271930 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-config\") pod \"13de3308-18f0-431d-997a-9288da0f520a\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " Nov 29 14:51:18 crc 
kubenswrapper[4907]: I1129 14:51:18.272003 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx5k8\" (UniqueName: \"kubernetes.io/projected/a9d771c9-5533-4167-adeb-f77c429ded79-kube-api-access-zx5k8\") pod \"a9d771c9-5533-4167-adeb-f77c429ded79\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.272032 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-config\") pod \"a9d771c9-5533-4167-adeb-f77c429ded79\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.272095 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-dns-svc\") pod \"13de3308-18f0-431d-997a-9288da0f520a\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.272134 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-combined-ca-bundle\") pod \"a9d771c9-5533-4167-adeb-f77c429ded79\" (UID: \"a9d771c9-5533-4167-adeb-f77c429ded79\") " Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.272169 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-sb\") pod \"13de3308-18f0-431d-997a-9288da0f520a\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.280700 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4r59x" 
event={"ID":"a9d771c9-5533-4167-adeb-f77c429ded79","Type":"ContainerDied","Data":"0e6c94f2b855540ddfda8a8d6971066a278f3d69e3ef8e3619621f126bed0f8f"} Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.280861 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4r59x" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.283738 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e6c94f2b855540ddfda8a8d6971066a278f3d69e3ef8e3619621f126bed0f8f" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.284240 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d771c9-5533-4167-adeb-f77c429ded79-kube-api-access-zx5k8" (OuterVolumeSpecName: "kube-api-access-zx5k8") pod "a9d771c9-5533-4167-adeb-f77c429ded79" (UID: "a9d771c9-5533-4167-adeb-f77c429ded79"). InnerVolumeSpecName "kube-api-access-zx5k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.310716 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7w4m2" event={"ID":"13de3308-18f0-431d-997a-9288da0f520a","Type":"ContainerDied","Data":"057103142cf0ea736a459fb701582bac34c39878ab808c5c6fc1e633ce633536"} Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.310953 4907 scope.go:117] "RemoveContainer" containerID="e957cbcc59c7dceca9e4424e4cb90793f779bc096698fbcc57511e06c8b7ffc1" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.310980 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13de3308-18f0-431d-997a-9288da0f520a-kube-api-access-5lhb9" (OuterVolumeSpecName: "kube-api-access-5lhb9") pod "13de3308-18f0-431d-997a-9288da0f520a" (UID: "13de3308-18f0-431d-997a-9288da0f520a"). InnerVolumeSpecName "kube-api-access-5lhb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.311103 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7w4m2" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.317729 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pwxst" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.317762 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pwxst" event={"ID":"8d940ef0-0877-471d-906a-b6235392867d","Type":"ContainerDied","Data":"6847b8265f02d597825b921ad787455802820d570bb7f68a57727149a8050b35"} Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.317797 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6847b8265f02d597825b921ad787455802820d570bb7f68a57727149a8050b35" Nov 29 14:51:18 crc kubenswrapper[4907]: E1129 14:51:18.320592 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-5vflp" podUID="1e3e611f-5e4c-4b2c-baea-5f74745f315b" Nov 29 14:51:18 crc kubenswrapper[4907]: E1129 14:51:18.320721 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-8lc4r" podUID="d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.325428 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-config" (OuterVolumeSpecName: "config") pod 
"a9d771c9-5533-4167-adeb-f77c429ded79" (UID: "a9d771c9-5533-4167-adeb-f77c429ded79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.331488 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13de3308-18f0-431d-997a-9288da0f520a" (UID: "13de3308-18f0-431d-997a-9288da0f520a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.374588 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-config" (OuterVolumeSpecName: "config") pod "13de3308-18f0-431d-997a-9288da0f520a" (UID: "13de3308-18f0-431d-997a-9288da0f520a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.375231 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-config\") pod \"13de3308-18f0-431d-997a-9288da0f520a\" (UID: \"13de3308-18f0-431d-997a-9288da0f520a\") " Nov 29 14:51:18 crc kubenswrapper[4907]: W1129 14:51:18.375916 4907 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/13de3308-18f0-431d-997a-9288da0f520a/volumes/kubernetes.io~configmap/config Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.375930 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-config" (OuterVolumeSpecName: "config") pod "13de3308-18f0-431d-997a-9288da0f520a" (UID: "13de3308-18f0-431d-997a-9288da0f520a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.376883 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lhb9\" (UniqueName: \"kubernetes.io/projected/13de3308-18f0-431d-997a-9288da0f520a-kube-api-access-5lhb9\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.376901 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.376909 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.376930 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx5k8\" (UniqueName: \"kubernetes.io/projected/a9d771c9-5533-4167-adeb-f77c429ded79-kube-api-access-zx5k8\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.376939 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.377055 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9d771c9-5533-4167-adeb-f77c429ded79" (UID: "a9d771c9-5533-4167-adeb-f77c429ded79"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.432356 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13de3308-18f0-431d-997a-9288da0f520a" (UID: "13de3308-18f0-431d-997a-9288da0f520a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.432666 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13de3308-18f0-431d-997a-9288da0f520a" (UID: "13de3308-18f0-431d-997a-9288da0f520a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.478136 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d771c9-5533-4167-adeb-f77c429ded79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.478185 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.478196 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13de3308-18f0-431d-997a-9288da0f520a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.639345 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7w4m2"] Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.648083 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-7w4m2"] Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.687461 4907 scope.go:117] "RemoveContainer" containerID="1876b9c8bc61ee8b5e89e654733f2b34b0e78c02cc8a8d312893b15e282bc3b4" Nov 29 14:51:18 crc kubenswrapper[4907]: I1129 14:51:18.815913 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-chkqj"] Nov 29 14:51:18 crc kubenswrapper[4907]: W1129 14:51:18.821053 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda942b3b6_791f_4d18_abdd_113f3372158b.slice/crio-4de953567c16603e227665c74673900f498265ca30fe73ba6963998328de5d7a WatchSource:0}: Error finding container 4de953567c16603e227665c74673900f498265ca30fe73ba6963998328de5d7a: Status 404 returned error can't find the container with id 4de953567c16603e227665c74673900f498265ca30fe73ba6963998328de5d7a Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.321258 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-sskgd"] Nov 29 14:51:19 crc kubenswrapper[4907]: E1129 14:51:19.322073 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13de3308-18f0-431d-997a-9288da0f520a" containerName="dnsmasq-dns" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.322087 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="13de3308-18f0-431d-997a-9288da0f520a" containerName="dnsmasq-dns" Nov 29 14:51:19 crc kubenswrapper[4907]: E1129 14:51:19.322127 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13de3308-18f0-431d-997a-9288da0f520a" containerName="init" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.322135 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="13de3308-18f0-431d-997a-9288da0f520a" containerName="init" Nov 29 14:51:19 crc kubenswrapper[4907]: E1129 14:51:19.322162 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8d940ef0-0877-471d-906a-b6235392867d" containerName="glance-db-sync" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.322168 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d940ef0-0877-471d-906a-b6235392867d" containerName="glance-db-sync" Nov 29 14:51:19 crc kubenswrapper[4907]: E1129 14:51:19.322217 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d771c9-5533-4167-adeb-f77c429ded79" containerName="neutron-db-sync" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.322223 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d771c9-5533-4167-adeb-f77c429ded79" containerName="neutron-db-sync" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.322476 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d940ef0-0877-471d-906a-b6235392867d" containerName="glance-db-sync" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.325122 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="13de3308-18f0-431d-997a-9288da0f520a" containerName="dnsmasq-dns" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.325148 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d771c9-5533-4167-adeb-f77c429ded79" containerName="neutron-db-sync" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.326649 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.348336 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-chkqj" event={"ID":"a942b3b6-791f-4d18-abdd-113f3372158b","Type":"ContainerStarted","Data":"6f3c544987c7090c1b6dc149e2e9b8dfd7e51ae1a09aafdfef7b6b69d3838678"} Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.348378 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-chkqj" event={"ID":"a942b3b6-791f-4d18-abdd-113f3372158b","Type":"ContainerStarted","Data":"4de953567c16603e227665c74673900f498265ca30fe73ba6963998328de5d7a"} Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.352083 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-sskgd"] Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.363251 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7c3295c-d537-4302-80c1-ce39f0f4fcb4","Type":"ContainerStarted","Data":"5bb03f96ea58f6710f35fa0e0f2b34eaf307facac712b36e63bcc71d9ab89995"} Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.364798 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6csw2" event={"ID":"de06d815-0165-4c7a-aeed-fda3a647ba27","Type":"ContainerStarted","Data":"63156ddbddad132ef95bfee0567526bc1dfc5f35237a842109c35d2633f01127"} Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.384836 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-chkqj" podStartSLOduration=11.384818251 podStartE2EDuration="11.384818251s" podCreationTimestamp="2025-11-29 14:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:19.384757559 +0000 UTC m=+1377.371595211" watchObservedRunningTime="2025-11-29 
14:51:19.384818251 +0000 UTC m=+1377.371655903" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.385213 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-92jsx" event={"ID":"0089fab2-d07c-4dad-bce1-a4c085a35d24","Type":"ContainerStarted","Data":"0bd911ebf8c3afdfa355fd7cbb762bd98e178dc4b1647485e73217b0b00edbab"} Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.407500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"8b4609569b62c2e44d946dc9af65d8555ae591e944b924531d07870bba7df8bf"} Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.430682 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-6csw2" podStartSLOduration=4.8440815090000005 podStartE2EDuration="35.430654187s" podCreationTimestamp="2025-11-29 14:50:44 +0000 UTC" firstStartedPulling="2025-11-29 14:50:47.662427401 +0000 UTC m=+1345.649265053" lastFinishedPulling="2025-11-29 14:51:18.249000079 +0000 UTC m=+1376.235837731" observedRunningTime="2025-11-29 14:51:19.406033976 +0000 UTC m=+1377.392871618" watchObservedRunningTime="2025-11-29 14:51:19.430654187 +0000 UTC m=+1377.417491839" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.477990 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-92jsx" podStartSLOduration=4.3707882 podStartE2EDuration="35.477970766s" podCreationTimestamp="2025-11-29 14:50:44 +0000 UTC" firstStartedPulling="2025-11-29 14:50:47.141821633 +0000 UTC m=+1345.128659285" lastFinishedPulling="2025-11-29 14:51:18.249004199 +0000 UTC m=+1376.235841851" observedRunningTime="2025-11-29 14:51:19.427786986 +0000 UTC m=+1377.414624638" watchObservedRunningTime="2025-11-29 14:51:19.477970766 +0000 UTC m=+1377.464808418" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.504832 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-config\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.504907 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.504931 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.505057 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-dns-svc\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.505191 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b528m\" (UniqueName: \"kubernetes.io/projected/8ec2d088-4a81-4692-b579-7179c5df5489-kube-api-access-b528m\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 
14:51:19.614797 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b528m\" (UniqueName: \"kubernetes.io/projected/8ec2d088-4a81-4692-b579-7179c5df5489-kube-api-access-b528m\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.614918 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-config\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.614958 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.614977 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.615006 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-dns-svc\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.616259 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-f84976bdf-sskgd"] Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.616476 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-config\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: E1129 14:51:19.617228 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-b528m ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-f84976bdf-sskgd" podUID="8ec2d088-4a81-4692-b579-7179c5df5489" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.622843 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.632516 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-dns-svc\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.632774 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.637360 4907 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb745b69-s8h6d"] Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.642281 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.679258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b528m\" (UniqueName: \"kubernetes.io/projected/8ec2d088-4a81-4692-b579-7179c5df5489-kube-api-access-b528m\") pod \"dnsmasq-dns-f84976bdf-sskgd\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.683132 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-s8h6d"] Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.820727 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8klw\" (UniqueName: \"kubernetes.io/projected/07213bf3-24a2-492a-8094-21b774bb7b97-kube-api-access-x8klw\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.821098 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.821117 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " 
pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.821137 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-config\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.821186 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-dns-svc\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.822712 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dfc6d84d8-nk2pb"] Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.824991 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.829452 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.829633 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.829797 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.830454 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rrjq4" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.870729 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dfc6d84d8-nk2pb"] Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923542 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-httpd-config\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923588 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-combined-ca-bundle\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923632 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p45gd\" (UniqueName: \"kubernetes.io/projected/f3eac663-661d-4bdb-bf65-3d92c9019225-kube-api-access-p45gd\") pod 
\"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923666 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923701 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923721 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-config\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923745 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-ovndb-tls-certs\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923799 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-dns-svc\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " 
pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923849 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8klw\" (UniqueName: \"kubernetes.io/projected/07213bf3-24a2-492a-8094-21b774bb7b97-kube-api-access-x8klw\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.923926 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-config\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.924473 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.924781 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-config\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.924974 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-dns-svc\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.925500 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.956864 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8klw\" (UniqueName: \"kubernetes.io/projected/07213bf3-24a2-492a-8094-21b774bb7b97-kube-api-access-x8klw\") pod \"dnsmasq-dns-fb745b69-s8h6d\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:19 crc kubenswrapper[4907]: I1129 14:51:19.986689 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.025752 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-config\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.025868 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-httpd-config\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.025894 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-combined-ca-bundle\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" 
Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.025919 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p45gd\" (UniqueName: \"kubernetes.io/projected/f3eac663-661d-4bdb-bf65-3d92c9019225-kube-api-access-p45gd\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.025953 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-ovndb-tls-certs\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.036079 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-httpd-config\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.047339 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-combined-ca-bundle\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.051773 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-ovndb-tls-certs\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.052143 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-config\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.076098 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p45gd\" (UniqueName: \"kubernetes.io/projected/f3eac663-661d-4bdb-bf65-3d92c9019225-kube-api-access-p45gd\") pod \"neutron-dfc6d84d8-nk2pb\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.168358 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.220243 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:51:20 crc kubenswrapper[4907]: E1129 14:51:20.231298 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice/crio-e204c0da95eeaedd25ced9e7033f8d1f086de9a44f3410c7c1468fe57ede7599\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bad7a6_9301_4f9e_8303_ae377c4f909f.slice\": RecentStats: unable to find data in memory cache]" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.241261 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.252420 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.252751 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4qqtq" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.252750 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.275202 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.342476 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.342831 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.342931 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 
14:51:20.343005 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.343044 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.343729 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd2gr\" (UniqueName: \"kubernetes.io/projected/4fe218c8-f724-425b-ad67-d5ac967bc0c9-kube-api-access-qd2gr\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.343777 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-logs\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.445473 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.445558 
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd2gr\" (UniqueName: \"kubernetes.io/projected/4fe218c8-f724-425b-ad67-d5ac967bc0c9-kube-api-access-qd2gr\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.445585 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-logs\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.445668 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.445696 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.445752 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.445805 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.445871 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.449873 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.449913 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-logs\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.451323 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.452003 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"77fd8dce9d57da53bf31ecef4dd0f31d01ba566a7ecec65e51d5a7c82a110973"} Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.454112 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.455491 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.455723 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.466380 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.476196 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd2gr\" (UniqueName: \"kubernetes.io/projected/4fe218c8-f724-425b-ad67-d5ac967bc0c9-kube-api-access-qd2gr\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.488184 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.497577 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13de3308-18f0-431d-997a-9288da0f520a" path="/var/lib/kubelet/pods/13de3308-18f0-431d-997a-9288da0f520a/volumes" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.503555 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.506207 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.506422 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.508599 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.547175 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b528m\" (UniqueName: \"kubernetes.io/projected/8ec2d088-4a81-4692-b579-7179c5df5489-kube-api-access-b528m\") pod \"8ec2d088-4a81-4692-b579-7179c5df5489\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.548006 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-nb\") pod \"8ec2d088-4a81-4692-b579-7179c5df5489\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.548525 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-config\") pod \"8ec2d088-4a81-4692-b579-7179c5df5489\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.548663 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-dns-svc\") pod \"8ec2d088-4a81-4692-b579-7179c5df5489\" (UID: \"8ec2d088-4a81-4692-b579-7179c5df5489\") " Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.549206 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-sb\") pod \"8ec2d088-4a81-4692-b579-7179c5df5489\" (UID: 
\"8ec2d088-4a81-4692-b579-7179c5df5489\") " Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.548547 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8ec2d088-4a81-4692-b579-7179c5df5489" (UID: "8ec2d088-4a81-4692-b579-7179c5df5489"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.548765 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-config" (OuterVolumeSpecName: "config") pod "8ec2d088-4a81-4692-b579-7179c5df5489" (UID: "8ec2d088-4a81-4692-b579-7179c5df5489"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.549115 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8ec2d088-4a81-4692-b579-7179c5df5489" (UID: "8ec2d088-4a81-4692-b579-7179c5df5489"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.549684 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8ec2d088-4a81-4692-b579-7179c5df5489" (UID: "8ec2d088-4a81-4692-b579-7179c5df5489"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.551145 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.551695 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.552252 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.552431 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.552618 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.552799 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.553029 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q69v\" (UniqueName: \"kubernetes.io/projected/253433fc-8f60-4aea-af99-a25c2e872cb4-kube-api-access-8q69v\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.553752 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.554356 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.554510 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.554605 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ec2d088-4a81-4692-b579-7179c5df5489-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 
14:51:20.554875 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec2d088-4a81-4692-b579-7179c5df5489-kube-api-access-b528m" (OuterVolumeSpecName: "kube-api-access-b528m") pod "8ec2d088-4a81-4692-b579-7179c5df5489" (UID: "8ec2d088-4a81-4692-b579-7179c5df5489"). InnerVolumeSpecName "kube-api-access-b528m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.632968 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.656221 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q69v\" (UniqueName: \"kubernetes.io/projected/253433fc-8f60-4aea-af99-a25c2e872cb4-kube-api-access-8q69v\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.656339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.656392 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.656469 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.656522 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.656550 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.656588 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.656654 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b528m\" (UniqueName: \"kubernetes.io/projected/8ec2d088-4a81-4692-b579-7179c5df5489-kube-api-access-b528m\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.656782 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") device mount path \"/mnt/openstack/pv10\"" 
pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.657286 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.657866 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.670219 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.674833 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.683167 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.683411 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q69v\" (UniqueName: \"kubernetes.io/projected/253433fc-8f60-4aea-af99-a25c2e872cb4-kube-api-access-8q69v\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.696347 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-s8h6d"] Nov 29 14:51:20 crc kubenswrapper[4907]: W1129 14:51:20.706732 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07213bf3_24a2_492a_8094_21b774bb7b97.slice/crio-3847bd34974ba05b87b571ccba6d1ee6151a8b2c129f649dfaa7509e197c26e2 WatchSource:0}: Error finding container 3847bd34974ba05b87b571ccba6d1ee6151a8b2c129f649dfaa7509e197c26e2: Status 404 returned error can't find the container with id 3847bd34974ba05b87b571ccba6d1ee6151a8b2c129f649dfaa7509e197c26e2 Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.779706 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:20 crc kubenswrapper[4907]: I1129 14:51:20.828512 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:21 crc kubenswrapper[4907]: I1129 14:51:21.073860 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dfc6d84d8-nk2pb"] Nov 29 14:51:21 crc kubenswrapper[4907]: I1129 14:51:21.484242 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfc6d84d8-nk2pb" event={"ID":"f3eac663-661d-4bdb-bf65-3d92c9019225","Type":"ContainerStarted","Data":"f10d0f544f72f6859d5df15e2c8b30d0815a25da7feda0a5ad83a0c0d9995853"} Nov 29 14:51:21 crc kubenswrapper[4907]: I1129 14:51:21.485198 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" event={"ID":"07213bf3-24a2-492a-8094-21b774bb7b97","Type":"ContainerStarted","Data":"3847bd34974ba05b87b571ccba6d1ee6151a8b2c129f649dfaa7509e197c26e2"} Nov 29 14:51:21 crc kubenswrapper[4907]: I1129 14:51:21.485268 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-sskgd" Nov 29 14:51:21 crc kubenswrapper[4907]: I1129 14:51:21.509697 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:51:21 crc kubenswrapper[4907]: I1129 14:51:21.571929 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-sskgd"] Nov 29 14:51:21 crc kubenswrapper[4907]: I1129 14:51:21.583612 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-sskgd"] Nov 29 14:51:21 crc kubenswrapper[4907]: I1129 14:51:21.871930 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:51:21 crc kubenswrapper[4907]: W1129 14:51:21.876264 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fe218c8_f724_425b_ad67_d5ac967bc0c9.slice/crio-9c94e1df360ece51c75b1fb63c6259c6d9c0e53d55e78bd3d7a8dff0e05b5e4b WatchSource:0}: Error finding container 9c94e1df360ece51c75b1fb63c6259c6d9c0e53d55e78bd3d7a8dff0e05b5e4b: Status 404 returned error can't find the container with id 9c94e1df360ece51c75b1fb63c6259c6d9c0e53d55e78bd3d7a8dff0e05b5e4b Nov 29 14:51:22 crc kubenswrapper[4907]: I1129 14:51:22.499734 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec2d088-4a81-4692-b579-7179c5df5489" path="/var/lib/kubelet/pods/8ec2d088-4a81-4692-b579-7179c5df5489/volumes" Nov 29 14:51:22 crc kubenswrapper[4907]: I1129 14:51:22.514106 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"0302b2f3b8ab753b14dd5186d2e679910a5df057135aaac09f15a5ccc42ec8da"} Nov 29 14:51:22 crc kubenswrapper[4907]: I1129 14:51:22.518055 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"253433fc-8f60-4aea-af99-a25c2e872cb4","Type":"ContainerStarted","Data":"6a463b8ae5923b0f5fdde1deb87f91f3c241ec9d636f48fa45fbc2c82fcd1051"} Nov 29 14:51:22 crc kubenswrapper[4907]: I1129 14:51:22.520776 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fe218c8-f724-425b-ad67-d5ac967bc0c9","Type":"ContainerStarted","Data":"9c94e1df360ece51c75b1fb63c6259c6d9c0e53d55e78bd3d7a8dff0e05b5e4b"} Nov 29 14:51:22 crc kubenswrapper[4907]: I1129 14:51:22.522267 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfc6d84d8-nk2pb" event={"ID":"f3eac663-661d-4bdb-bf65-3d92c9019225","Type":"ContainerStarted","Data":"8fdcdd1b28e6cdbe5dad885dff079712be2bfe4635dd08d1555aa547006ff028"} Nov 29 14:51:22 crc kubenswrapper[4907]: I1129 14:51:22.523331 4907 generic.go:334] "Generic 
(PLEG): container finished" podID="07213bf3-24a2-492a-8094-21b774bb7b97" containerID="cac75ed3a6ff16fc4b44777d7cb5db755c231a7da24efa3c9e860cc0942554b2" exitCode=0 Nov 29 14:51:22 crc kubenswrapper[4907]: I1129 14:51:22.523358 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" event={"ID":"07213bf3-24a2-492a-8094-21b774bb7b97","Type":"ContainerDied","Data":"cac75ed3a6ff16fc4b44777d7cb5db755c231a7da24efa3c9e860cc0942554b2"} Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.549211 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fe218c8-f724-425b-ad67-d5ac967bc0c9","Type":"ContainerStarted","Data":"3d90897aa591adced08fec0122cbdf378b551fdabbf671aac4d6664ca433bfe3"} Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.554100 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfc6d84d8-nk2pb" event={"ID":"f3eac663-661d-4bdb-bf65-3d92c9019225","Type":"ContainerStarted","Data":"7e8cbc93d8d2f69d58bda5141aed2f4b91f0365b241ea68c2b3ff9697d99b781"} Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.554273 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.557718 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" event={"ID":"07213bf3-24a2-492a-8094-21b774bb7b97","Type":"ContainerStarted","Data":"4d567e1378c65706be58e4b02188fc84061792e5e0cb65d989fc6545e0a5f7af"} Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.557753 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.561733 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"dc430b6213522ea592dca6451a43bdee2ae477fe2683fe1d8d0e942961310ca6"} Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.576915 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dfc6d84d8-nk2pb" podStartSLOduration=4.5768989300000005 podStartE2EDuration="4.57689893s" podCreationTimestamp="2025-11-29 14:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:23.572760742 +0000 UTC m=+1381.559598394" watchObservedRunningTime="2025-11-29 14:51:23.57689893 +0000 UTC m=+1381.563736582" Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.585751 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"253433fc-8f60-4aea-af99-a25c2e872cb4","Type":"ContainerStarted","Data":"74b57b043ee7cd71c9ccadd2d4b333fb7cef69afb6b5a49d9c4ad0cbe8119b47"} Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.658516 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" podStartSLOduration=4.658495705 podStartE2EDuration="4.658495705s" podCreationTimestamp="2025-11-29 14:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:23.595181371 +0000 UTC m=+1381.582019023" watchObservedRunningTime="2025-11-29 14:51:23.658495705 +0000 UTC m=+1381.645333357" Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.825864 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:51:23 crc kubenswrapper[4907]: I1129 14:51:23.858540 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:51:24 crc kubenswrapper[4907]: I1129 14:51:24.602302 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"253433fc-8f60-4aea-af99-a25c2e872cb4","Type":"ContainerStarted","Data":"3617d73d27bc485d3d7b3b04b850b212caee2729d5f66c2fdeecbf7a2305ec10"} Nov 29 14:51:24 crc kubenswrapper[4907]: I1129 14:51:24.602398 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerName="glance-log" containerID="cri-o://74b57b043ee7cd71c9ccadd2d4b333fb7cef69afb6b5a49d9c4ad0cbe8119b47" gracePeriod=30 Nov 29 14:51:24 crc kubenswrapper[4907]: I1129 14:51:24.602454 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerName="glance-httpd" containerID="cri-o://3617d73d27bc485d3d7b3b04b850b212caee2729d5f66c2fdeecbf7a2305ec10" gracePeriod=30 Nov 29 14:51:24 crc kubenswrapper[4907]: I1129 14:51:24.611018 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fe218c8-f724-425b-ad67-d5ac967bc0c9","Type":"ContainerStarted","Data":"f188464360dfa55fb1eda709a739e3147748149d0077a49ff10b5910f0bdc9c1"} Nov 29 14:51:24 crc kubenswrapper[4907]: I1129 14:51:24.611161 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerName="glance-log" containerID="cri-o://3d90897aa591adced08fec0122cbdf378b551fdabbf671aac4d6664ca433bfe3" gracePeriod=30 Nov 29 14:51:24 crc kubenswrapper[4907]: I1129 14:51:24.611278 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerName="glance-httpd" containerID="cri-o://f188464360dfa55fb1eda709a739e3147748149d0077a49ff10b5910f0bdc9c1" 
gracePeriod=30 Nov 29 14:51:24 crc kubenswrapper[4907]: I1129 14:51:24.654955 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.6549307540000004 podStartE2EDuration="5.654930754s" podCreationTimestamp="2025-11-29 14:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:24.636225041 +0000 UTC m=+1382.623062693" watchObservedRunningTime="2025-11-29 14:51:24.654930754 +0000 UTC m=+1382.641768406" Nov 29 14:51:24 crc kubenswrapper[4907]: I1129 14:51:24.674979 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.674954565 podStartE2EDuration="5.674954565s" podCreationTimestamp="2025-11-29 14:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:24.658152436 +0000 UTC m=+1382.644990098" watchObservedRunningTime="2025-11-29 14:51:24.674954565 +0000 UTC m=+1382.661792217" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.637308 4907 generic.go:334] "Generic (PLEG): container finished" podID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerID="74b57b043ee7cd71c9ccadd2d4b333fb7cef69afb6b5a49d9c4ad0cbe8119b47" exitCode=143 Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.637381 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"253433fc-8f60-4aea-af99-a25c2e872cb4","Type":"ContainerDied","Data":"74b57b043ee7cd71c9ccadd2d4b333fb7cef69afb6b5a49d9c4ad0cbe8119b47"} Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.641184 4907 generic.go:334] "Generic (PLEG): container finished" podID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerID="3d90897aa591adced08fec0122cbdf378b551fdabbf671aac4d6664ca433bfe3" exitCode=143 Nov 29 
14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.641241 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fe218c8-f724-425b-ad67-d5ac967bc0c9","Type":"ContainerDied","Data":"3d90897aa591adced08fec0122cbdf378b551fdabbf671aac4d6664ca433bfe3"} Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.719914 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68c5f6d545-tlmv5"] Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.722492 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.727370 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.727727 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.733335 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68c5f6d545-tlmv5"] Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.803365 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-public-tls-certs\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.803456 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-ovndb-tls-certs\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 
14:51:25.803587 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-config\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.803963 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqkj\" (UniqueName: \"kubernetes.io/projected/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-kube-api-access-9vqkj\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.804102 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-combined-ca-bundle\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.804261 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-httpd-config\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.804404 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-internal-tls-certs\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 
14:51:25.906617 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-combined-ca-bundle\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.906680 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-httpd-config\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.906716 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-internal-tls-certs\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.906736 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-public-tls-certs\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.906770 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-ovndb-tls-certs\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.906804 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-config\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.906894 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqkj\" (UniqueName: \"kubernetes.io/projected/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-kube-api-access-9vqkj\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.912697 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-httpd-config\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.914735 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-config\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.914958 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-internal-tls-certs\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.915808 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-ovndb-tls-certs\") pod \"neutron-68c5f6d545-tlmv5\" 
(UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.916175 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-public-tls-certs\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.931069 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-combined-ca-bundle\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:25 crc kubenswrapper[4907]: I1129 14:51:25.947404 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqkj\" (UniqueName: \"kubernetes.io/projected/e814e290-11f9-48bc-9f3d-36aeecf0ec1a-kube-api-access-9vqkj\") pod \"neutron-68c5f6d545-tlmv5\" (UID: \"e814e290-11f9-48bc-9f3d-36aeecf0ec1a\") " pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:26 crc kubenswrapper[4907]: I1129 14:51:26.056800 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:26 crc kubenswrapper[4907]: I1129 14:51:26.656229 4907 generic.go:334] "Generic (PLEG): container finished" podID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerID="3617d73d27bc485d3d7b3b04b850b212caee2729d5f66c2fdeecbf7a2305ec10" exitCode=0 Nov 29 14:51:26 crc kubenswrapper[4907]: I1129 14:51:26.656715 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"253433fc-8f60-4aea-af99-a25c2e872cb4","Type":"ContainerDied","Data":"3617d73d27bc485d3d7b3b04b850b212caee2729d5f66c2fdeecbf7a2305ec10"} Nov 29 14:51:26 crc kubenswrapper[4907]: I1129 14:51:26.662858 4907 generic.go:334] "Generic (PLEG): container finished" podID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerID="f188464360dfa55fb1eda709a739e3147748149d0077a49ff10b5910f0bdc9c1" exitCode=0 Nov 29 14:51:26 crc kubenswrapper[4907]: I1129 14:51:26.662921 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fe218c8-f724-425b-ad67-d5ac967bc0c9","Type":"ContainerDied","Data":"f188464360dfa55fb1eda709a739e3147748149d0077a49ff10b5910f0bdc9c1"} Nov 29 14:51:26 crc kubenswrapper[4907]: I1129 14:51:26.923937 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68c5f6d545-tlmv5"] Nov 29 14:51:27 crc kubenswrapper[4907]: I1129 14:51:27.674811 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68c5f6d545-tlmv5" event={"ID":"e814e290-11f9-48bc-9f3d-36aeecf0ec1a","Type":"ContainerStarted","Data":"f72732dad9524287841640477bfa524356cec75b9e889fa9981074c76375ed2d"} Nov 29 14:51:28 crc kubenswrapper[4907]: I1129 14:51:28.490243 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Nov 29 14:51:28 crc kubenswrapper[4907]: I1129 14:51:28.490584 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:51:28 crc kubenswrapper[4907]: I1129 14:51:28.492787 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:51:28 crc kubenswrapper[4907]: I1129 14:51:28.493606 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1627f8336c2b950c2441aa29da8e2bbe0bedafb1bb7676292ab2c4a335d23b1"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 14:51:28 crc kubenswrapper[4907]: I1129 14:51:28.493676 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://d1627f8336c2b950c2441aa29da8e2bbe0bedafb1bb7676292ab2c4a335d23b1" gracePeriod=600 Nov 29 14:51:28 crc kubenswrapper[4907]: I1129 14:51:28.708632 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="d1627f8336c2b950c2441aa29da8e2bbe0bedafb1bb7676292ab2c4a335d23b1" exitCode=0 Nov 29 14:51:28 crc kubenswrapper[4907]: I1129 14:51:28.708699 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"d1627f8336c2b950c2441aa29da8e2bbe0bedafb1bb7676292ab2c4a335d23b1"} Nov 29 14:51:28 crc kubenswrapper[4907]: I1129 14:51:28.708736 4907 scope.go:117] "RemoveContainer" containerID="b8e5b56ee968d515ff618b3f298ba561ab70814c2dd33e300a89c15ce55549c1" Nov 29 14:51:29 crc kubenswrapper[4907]: I1129 14:51:29.745202 4907 generic.go:334] "Generic (PLEG): container finished" podID="de06d815-0165-4c7a-aeed-fda3a647ba27" containerID="63156ddbddad132ef95bfee0567526bc1dfc5f35237a842109c35d2633f01127" exitCode=0 Nov 29 14:51:29 crc kubenswrapper[4907]: I1129 14:51:29.745306 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6csw2" event={"ID":"de06d815-0165-4c7a-aeed-fda3a647ba27","Type":"ContainerDied","Data":"63156ddbddad132ef95bfee0567526bc1dfc5f35237a842109c35d2633f01127"} Nov 29 14:51:29 crc kubenswrapper[4907]: I1129 14:51:29.747723 4907 generic.go:334] "Generic (PLEG): container finished" podID="0089fab2-d07c-4dad-bce1-a4c085a35d24" containerID="0bd911ebf8c3afdfa355fd7cbb762bd98e178dc4b1647485e73217b0b00edbab" exitCode=0 Nov 29 14:51:29 crc kubenswrapper[4907]: I1129 14:51:29.747793 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-92jsx" event={"ID":"0089fab2-d07c-4dad-bce1-a4c085a35d24","Type":"ContainerDied","Data":"0bd911ebf8c3afdfa355fd7cbb762bd98e178dc4b1647485e73217b0b00edbab"} Nov 29 14:51:29 crc kubenswrapper[4907]: I1129 14:51:29.749976 4907 generic.go:334] "Generic (PLEG): container finished" podID="a942b3b6-791f-4d18-abdd-113f3372158b" containerID="6f3c544987c7090c1b6dc149e2e9b8dfd7e51ae1a09aafdfef7b6b69d3838678" exitCode=0 Nov 29 14:51:29 crc kubenswrapper[4907]: I1129 14:51:29.750046 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-chkqj" 
event={"ID":"a942b3b6-791f-4d18-abdd-113f3372158b","Type":"ContainerDied","Data":"6f3c544987c7090c1b6dc149e2e9b8dfd7e51ae1a09aafdfef7b6b69d3838678"} Nov 29 14:51:29 crc kubenswrapper[4907]: I1129 14:51:29.988414 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:51:30 crc kubenswrapper[4907]: I1129 14:51:30.075174 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-tg5km"] Nov 29 14:51:30 crc kubenswrapper[4907]: I1129 14:51:30.075491 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" podUID="aeb5490b-3d2d-4527-91f5-b6a93c7fed08" containerName="dnsmasq-dns" containerID="cri-o://526e56fe0320227fd4345fc96fa4dc53890c9e5c0791bb167e38b075ab06fbb1" gracePeriod=10 Nov 29 14:51:30 crc kubenswrapper[4907]: I1129 14:51:30.767752 4907 generic.go:334] "Generic (PLEG): container finished" podID="aeb5490b-3d2d-4527-91f5-b6a93c7fed08" containerID="526e56fe0320227fd4345fc96fa4dc53890c9e5c0791bb167e38b075ab06fbb1" exitCode=0 Nov 29 14:51:30 crc kubenswrapper[4907]: I1129 14:51:30.769149 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" event={"ID":"aeb5490b-3d2d-4527-91f5-b6a93c7fed08","Type":"ContainerDied","Data":"526e56fe0320227fd4345fc96fa4dc53890c9e5c0791bb167e38b075ab06fbb1"} Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.148873 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6csw2" Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.196905 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.200595 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de06d815-0165-4c7a-aeed-fda3a647ba27-logs\") pod \"de06d815-0165-4c7a-aeed-fda3a647ba27\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.201210 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-config-data\") pod \"de06d815-0165-4c7a-aeed-fda3a647ba27\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.201276 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-combined-ca-bundle\") pod \"de06d815-0165-4c7a-aeed-fda3a647ba27\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.201108 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de06d815-0165-4c7a-aeed-fda3a647ba27-logs" (OuterVolumeSpecName: "logs") pod "de06d815-0165-4c7a-aeed-fda3a647ba27" (UID: "de06d815-0165-4c7a-aeed-fda3a647ba27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.201540 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qw7t\" (UniqueName: \"kubernetes.io/projected/de06d815-0165-4c7a-aeed-fda3a647ba27-kube-api-access-4qw7t\") pod \"de06d815-0165-4c7a-aeed-fda3a647ba27\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.201569 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-scripts\") pod \"de06d815-0165-4c7a-aeed-fda3a647ba27\" (UID: \"de06d815-0165-4c7a-aeed-fda3a647ba27\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.202164 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de06d815-0165-4c7a-aeed-fda3a647ba27-logs\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.207788 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de06d815-0165-4c7a-aeed-fda3a647ba27-kube-api-access-4qw7t" (OuterVolumeSpecName: "kube-api-access-4qw7t") pod "de06d815-0165-4c7a-aeed-fda3a647ba27" (UID: "de06d815-0165-4c7a-aeed-fda3a647ba27"). InnerVolumeSpecName "kube-api-access-4qw7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.224533 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-scripts" (OuterVolumeSpecName: "scripts") pod "de06d815-0165-4c7a-aeed-fda3a647ba27" (UID: "de06d815-0165-4c7a-aeed-fda3a647ba27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.254405 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-config-data" (OuterVolumeSpecName: "config-data") pod "de06d815-0165-4c7a-aeed-fda3a647ba27" (UID: "de06d815-0165-4c7a-aeed-fda3a647ba27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.282652 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310039 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-httpd-run\") pod \"253433fc-8f60-4aea-af99-a25c2e872cb4\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310124 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-config-data\") pod \"253433fc-8f60-4aea-af99-a25c2e872cb4\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310193 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd2gr\" (UniqueName: \"kubernetes.io/projected/4fe218c8-f724-425b-ad67-d5ac967bc0c9-kube-api-access-qd2gr\") pod \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310268 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-logs\") pod \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310314 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-scripts\") pod \"253433fc-8f60-4aea-af99-a25c2e872cb4\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310359 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-combined-ca-bundle\") pod \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310384 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-httpd-run\") pod \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310416 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-config-data\") pod \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310466 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-combined-ca-bundle\") pod \"253433fc-8f60-4aea-af99-a25c2e872cb4\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310510 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-logs\") pod \"253433fc-8f60-4aea-af99-a25c2e872cb4\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310569 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-scripts\") pod \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310609 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\" (UID: \"4fe218c8-f724-425b-ad67-d5ac967bc0c9\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310633 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"253433fc-8f60-4aea-af99-a25c2e872cb4\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.310739 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q69v\" (UniqueName: \"kubernetes.io/projected/253433fc-8f60-4aea-af99-a25c2e872cb4-kube-api-access-8q69v\") pod \"253433fc-8f60-4aea-af99-a25c2e872cb4\" (UID: \"253433fc-8f60-4aea-af99-a25c2e872cb4\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.311713 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.311739 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qw7t\" (UniqueName: \"kubernetes.io/projected/de06d815-0165-4c7a-aeed-fda3a647ba27-kube-api-access-4qw7t\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.311750 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.332256 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4fe218c8-f724-425b-ad67-d5ac967bc0c9" (UID: "4fe218c8-f724-425b-ad67-d5ac967bc0c9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.332815 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-logs" (OuterVolumeSpecName: "logs") pod "4fe218c8-f724-425b-ad67-d5ac967bc0c9" (UID: "4fe218c8-f724-425b-ad67-d5ac967bc0c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.333149 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-logs" (OuterVolumeSpecName: "logs") pod "253433fc-8f60-4aea-af99-a25c2e872cb4" (UID: "253433fc-8f60-4aea-af99-a25c2e872cb4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.333328 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "253433fc-8f60-4aea-af99-a25c2e872cb4" (UID: "253433fc-8f60-4aea-af99-a25c2e872cb4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.335755 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253433fc-8f60-4aea-af99-a25c2e872cb4-kube-api-access-8q69v" (OuterVolumeSpecName: "kube-api-access-8q69v") pod "253433fc-8f60-4aea-af99-a25c2e872cb4" (UID: "253433fc-8f60-4aea-af99-a25c2e872cb4"). InnerVolumeSpecName "kube-api-access-8q69v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.344796 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-scripts" (OuterVolumeSpecName: "scripts") pod "253433fc-8f60-4aea-af99-a25c2e872cb4" (UID: "253433fc-8f60-4aea-af99-a25c2e872cb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.355016 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "4fe218c8-f724-425b-ad67-d5ac967bc0c9" (UID: "4fe218c8-f724-425b-ad67-d5ac967bc0c9"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.357337 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe218c8-f724-425b-ad67-d5ac967bc0c9-kube-api-access-qd2gr" (OuterVolumeSpecName: "kube-api-access-qd2gr") pod "4fe218c8-f724-425b-ad67-d5ac967bc0c9" (UID: "4fe218c8-f724-425b-ad67-d5ac967bc0c9"). InnerVolumeSpecName "kube-api-access-qd2gr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.381665 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "253433fc-8f60-4aea-af99-a25c2e872cb4" (UID: "253433fc-8f60-4aea-af99-a25c2e872cb4"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.398430 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-scripts" (OuterVolumeSpecName: "scripts") pod "4fe218c8-f724-425b-ad67-d5ac967bc0c9" (UID: "4fe218c8-f724-425b-ad67-d5ac967bc0c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414206 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd2gr\" (UniqueName: \"kubernetes.io/projected/4fe218c8-f724-425b-ad67-d5ac967bc0c9-kube-api-access-qd2gr\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414234 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-logs\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414245 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414255 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fe218c8-f724-425b-ad67-d5ac967bc0c9-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414265 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-logs\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414273 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414294 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414312 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414321 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q69v\" (UniqueName: \"kubernetes.io/projected/253433fc-8f60-4aea-af99-a25c2e872cb4-kube-api-access-8q69v\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.414332 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/253433fc-8f60-4aea-af99-a25c2e872cb4-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.481597 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.498722 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fe218c8-f724-425b-ad67-d5ac967bc0c9" (UID: "4fe218c8-f724-425b-ad67-d5ac967bc0c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.519033 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.519064 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.537894 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de06d815-0165-4c7a-aeed-fda3a647ba27" (UID: "de06d815-0165-4c7a-aeed-fda3a647ba27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.559928 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.561767 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "253433fc-8f60-4aea-af99-a25c2e872cb4" (UID: "253433fc-8f60-4aea-af99-a25c2e872cb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.565593 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-chkqj"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.587089 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-config-data" (OuterVolumeSpecName: "config-data") pod "253433fc-8f60-4aea-af99-a25c2e872cb4" (UID: "253433fc-8f60-4aea-af99-a25c2e872cb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.588792 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-92jsx"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.626992 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df5rb\" (UniqueName: \"kubernetes.io/projected/a942b3b6-791f-4d18-abdd-113f3372158b-kube-api-access-df5rb\") pod \"a942b3b6-791f-4d18-abdd-113f3372158b\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627064 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-credential-keys\") pod \"a942b3b6-791f-4d18-abdd-113f3372158b\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627187 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-config-data\") pod \"a942b3b6-791f-4d18-abdd-113f3372158b\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627318 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-fernet-keys\") pod \"a942b3b6-791f-4d18-abdd-113f3372158b\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627349 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-db-sync-config-data\") pod \"0089fab2-d07c-4dad-bce1-a4c085a35d24\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627379 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-combined-ca-bundle\") pod \"0089fab2-d07c-4dad-bce1-a4c085a35d24\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627420 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvrnk\" (UniqueName: \"kubernetes.io/projected/0089fab2-d07c-4dad-bce1-a4c085a35d24-kube-api-access-lvrnk\") pod \"0089fab2-d07c-4dad-bce1-a4c085a35d24\" (UID: \"0089fab2-d07c-4dad-bce1-a4c085a35d24\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627463 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-scripts\") pod \"a942b3b6-791f-4d18-abdd-113f3372158b\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627487 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-combined-ca-bundle\") pod \"a942b3b6-791f-4d18-abdd-113f3372158b\" (UID: \"a942b3b6-791f-4d18-abdd-113f3372158b\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627915 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627941 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627953 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de06d815-0165-4c7a-aeed-fda3a647ba27-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.627966 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253433fc-8f60-4aea-af99-a25c2e872cb4-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.632910 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a942b3b6-791f-4d18-abdd-113f3372158b-kube-api-access-df5rb" (OuterVolumeSpecName: "kube-api-access-df5rb") pod "a942b3b6-791f-4d18-abdd-113f3372158b" (UID: "a942b3b6-791f-4d18-abdd-113f3372158b"). InnerVolumeSpecName "kube-api-access-df5rb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.642547 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a942b3b6-791f-4d18-abdd-113f3372158b" (UID: "a942b3b6-791f-4d18-abdd-113f3372158b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.642703 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0089fab2-d07c-4dad-bce1-a4c085a35d24-kube-api-access-lvrnk" (OuterVolumeSpecName: "kube-api-access-lvrnk") pod "0089fab2-d07c-4dad-bce1-a4c085a35d24" (UID: "0089fab2-d07c-4dad-bce1-a4c085a35d24"). InnerVolumeSpecName "kube-api-access-lvrnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.647990 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-scripts" (OuterVolumeSpecName: "scripts") pod "a942b3b6-791f-4d18-abdd-113f3372158b" (UID: "a942b3b6-791f-4d18-abdd-113f3372158b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.649522 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a942b3b6-791f-4d18-abdd-113f3372158b" (UID: "a942b3b6-791f-4d18-abdd-113f3372158b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.659403 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.675087 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0089fab2-d07c-4dad-bce1-a4c085a35d24" (UID: "0089fab2-d07c-4dad-bce1-a4c085a35d24"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.729290 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7gfb\" (UniqueName: \"kubernetes.io/projected/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-kube-api-access-g7gfb\") pod \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.729336 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-dns-svc\") pod \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.729469 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-nb\") pod \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.729514 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-config\") pod \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.729671 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-sb\") pod \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\" (UID: \"aeb5490b-3d2d-4527-91f5-b6a93c7fed08\") "
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.730090 4907 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.730105 4907 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.730116 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvrnk\" (UniqueName: \"kubernetes.io/projected/0089fab2-d07c-4dad-bce1-a4c085a35d24-kube-api-access-lvrnk\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.730124 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.730132 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df5rb\" (UniqueName: \"kubernetes.io/projected/a942b3b6-791f-4d18-abdd-113f3372158b-kube-api-access-df5rb\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.730142 4907 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-credential-keys\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.734733 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-kube-api-access-g7gfb" (OuterVolumeSpecName: "kube-api-access-g7gfb") pod "aeb5490b-3d2d-4527-91f5-b6a93c7fed08" (UID: "aeb5490b-3d2d-4527-91f5-b6a93c7fed08"). InnerVolumeSpecName "kube-api-access-g7gfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.814383 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-chkqj" event={"ID":"a942b3b6-791f-4d18-abdd-113f3372158b","Type":"ContainerDied","Data":"4de953567c16603e227665c74673900f498265ca30fe73ba6963998328de5d7a"}
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.814427 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4de953567c16603e227665c74673900f498265ca30fe73ba6963998328de5d7a"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.814554 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-chkqj"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.819386 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fe218c8-f724-425b-ad67-d5ac967bc0c9","Type":"ContainerDied","Data":"9c94e1df360ece51c75b1fb63c6259c6d9c0e53d55e78bd3d7a8dff0e05b5e4b"}
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.819474 4907 scope.go:117] "RemoveContainer" containerID="f188464360dfa55fb1eda709a739e3147748149d0077a49ff10b5910f0bdc9c1"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.819632 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.822791 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6csw2" event={"ID":"de06d815-0165-4c7a-aeed-fda3a647ba27","Type":"ContainerDied","Data":"b03306e63e50ef4f821ce689af421c66c0187f5f9914317856308aff8f1efc96"}
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.822820 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b03306e63e50ef4f821ce689af421c66c0187f5f9914317856308aff8f1efc96"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.822886 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6csw2"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.830166 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f"}
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.832178 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7gfb\" (UniqueName: \"kubernetes.io/projected/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-kube-api-access-g7gfb\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.835141 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km" event={"ID":"aeb5490b-3d2d-4527-91f5-b6a93c7fed08","Type":"ContainerDied","Data":"a106873ef5938b3b19fd9d89020d5b47c2dc5fd43ac79982ecc7ea3ed9961f28"}
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.835252 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-tg5km"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.838176 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-92jsx" event={"ID":"0089fab2-d07c-4dad-bce1-a4c085a35d24","Type":"ContainerDied","Data":"6e1bdc255e22cd5ad1e335ebf3516d3911678d6bfb24a607c302be37f9fac3db"}
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.838218 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e1bdc255e22cd5ad1e335ebf3516d3911678d6bfb24a607c302be37f9fac3db"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.838277 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-92jsx"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.844843 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68c5f6d545-tlmv5" event={"ID":"e814e290-11f9-48bc-9f3d-36aeecf0ec1a","Type":"ContainerStarted","Data":"805366096d2e3c28e98c7a5d3a355b99c47f7a7ea2d44e985952c8b53aacf75c"}
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.854032 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"253433fc-8f60-4aea-af99-a25c2e872cb4","Type":"ContainerDied","Data":"6a463b8ae5923b0f5fdde1deb87f91f3c241ec9d636f48fa45fbc2c82fcd1051"}
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.854134 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.872048 4907 scope.go:117] "RemoveContainer" containerID="3d90897aa591adced08fec0122cbdf378b551fdabbf671aac4d6664ca433bfe3"
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.914472 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a942b3b6-791f-4d18-abdd-113f3372158b" (UID: "a942b3b6-791f-4d18-abdd-113f3372158b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.934878 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.962682 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0089fab2-d07c-4dad-bce1-a4c085a35d24" (UID: "0089fab2-d07c-4dad-bce1-a4c085a35d24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:33 crc kubenswrapper[4907]: I1129 14:51:33.997757 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-config-data" (OuterVolumeSpecName: "config-data") pod "a942b3b6-791f-4d18-abdd-113f3372158b" (UID: "a942b3b6-791f-4d18-abdd-113f3372158b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.002601 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-config-data" (OuterVolumeSpecName: "config-data") pod "4fe218c8-f724-425b-ad67-d5ac967bc0c9" (UID: "4fe218c8-f724-425b-ad67-d5ac967bc0c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.011654 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aeb5490b-3d2d-4527-91f5-b6a93c7fed08" (UID: "aeb5490b-3d2d-4527-91f5-b6a93c7fed08"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.019509 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-config" (OuterVolumeSpecName: "config") pod "aeb5490b-3d2d-4527-91f5-b6a93c7fed08" (UID: "aeb5490b-3d2d-4527-91f5-b6a93c7fed08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.022716 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aeb5490b-3d2d-4527-91f5-b6a93c7fed08" (UID: "aeb5490b-3d2d-4527-91f5-b6a93c7fed08"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.025813 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aeb5490b-3d2d-4527-91f5-b6a93c7fed08" (UID: "aeb5490b-3d2d-4527-91f5-b6a93c7fed08"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.039326 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-config\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.039355 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.039365 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a942b3b6-791f-4d18-abdd-113f3372158b-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.039374 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe218c8-f724-425b-ad67-d5ac967bc0c9-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.039382 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.039390 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/0089fab2-d07c-4dad-bce1-a4c085a35d24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.039398 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeb5490b-3d2d-4527-91f5-b6a93c7fed08-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.214741 4907 scope.go:117] "RemoveContainer" containerID="526e56fe0320227fd4345fc96fa4dc53890c9e5c0791bb167e38b075ab06fbb1" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.283141 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-tg5km"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.283740 4907 scope.go:117] "RemoveContainer" containerID="7965bddea3532b07e619eb7e7edd180e8e0b0b8c3f238bf0a9032a01221da145" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.306350 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-tg5km"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.326432 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.335921 4907 scope.go:117] "RemoveContainer" containerID="3617d73d27bc485d3d7b3b04b850b212caee2729d5f66c2fdeecbf7a2305ec10" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.345099 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-65859db6b4-hwsds"] Nov 29 14:51:34 crc kubenswrapper[4907]: E1129 14:51:34.345724 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerName="glance-log" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.345817 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerName="glance-log" Nov 29 14:51:34 crc kubenswrapper[4907]: 
E1129 14:51:34.345895 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerName="glance-httpd" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.345961 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerName="glance-httpd" Nov 29 14:51:34 crc kubenswrapper[4907]: E1129 14:51:34.346035 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a942b3b6-791f-4d18-abdd-113f3372158b" containerName="keystone-bootstrap" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.346094 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a942b3b6-791f-4d18-abdd-113f3372158b" containerName="keystone-bootstrap" Nov 29 14:51:34 crc kubenswrapper[4907]: E1129 14:51:34.346162 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb5490b-3d2d-4527-91f5-b6a93c7fed08" containerName="dnsmasq-dns" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.346211 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb5490b-3d2d-4527-91f5-b6a93c7fed08" containerName="dnsmasq-dns" Nov 29 14:51:34 crc kubenswrapper[4907]: E1129 14:51:34.346261 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de06d815-0165-4c7a-aeed-fda3a647ba27" containerName="placement-db-sync" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.346310 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="de06d815-0165-4c7a-aeed-fda3a647ba27" containerName="placement-db-sync" Nov 29 14:51:34 crc kubenswrapper[4907]: E1129 14:51:34.346367 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerName="glance-httpd" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.346421 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerName="glance-httpd" Nov 29 14:51:34 crc kubenswrapper[4907]: E1129 14:51:34.346498 
4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerName="glance-log" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.346547 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerName="glance-log" Nov 29 14:51:34 crc kubenswrapper[4907]: E1129 14:51:34.346603 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0089fab2-d07c-4dad-bce1-a4c085a35d24" containerName="barbican-db-sync" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.346665 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0089fab2-d07c-4dad-bce1-a4c085a35d24" containerName="barbican-db-sync" Nov 29 14:51:34 crc kubenswrapper[4907]: E1129 14:51:34.346723 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb5490b-3d2d-4527-91f5-b6a93c7fed08" containerName="init" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.346790 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb5490b-3d2d-4527-91f5-b6a93c7fed08" containerName="init" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.347036 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerName="glance-log" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.347100 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a942b3b6-791f-4d18-abdd-113f3372158b" containerName="keystone-bootstrap" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.347152 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb5490b-3d2d-4527-91f5-b6a93c7fed08" containerName="dnsmasq-dns" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.347213 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerName="glance-httpd" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.347276 4907 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="de06d815-0165-4c7a-aeed-fda3a647ba27" containerName="placement-db-sync" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.347340 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0089fab2-d07c-4dad-bce1-a4c085a35d24" containerName="barbican-db-sync" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.347394 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" containerName="glance-httpd" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.347473 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="253433fc-8f60-4aea-af99-a25c2e872cb4" containerName="glance-log" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.348704 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.371826 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.372079 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.372234 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.372353 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fm8g9" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.372457 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.372537 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.400585 4907 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/placement-65859db6b4-hwsds"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.428506 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.430749 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.436591 4907 scope.go:117] "RemoveContainer" containerID="74b57b043ee7cd71c9ccadd2d4b333fb7cef69afb6b5a49d9c4ad0cbe8119b47" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.442541 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.442793 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4qqtq" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.444808 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.445458 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.464858 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.519452 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253433fc-8f60-4aea-af99-a25c2e872cb4" path="/var/lib/kubelet/pods/253433fc-8f60-4aea-af99-a25c2e872cb4/volumes" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.520249 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb5490b-3d2d-4527-91f5-b6a93c7fed08" path="/var/lib/kubelet/pods/aeb5490b-3d2d-4527-91f5-b6a93c7fed08/volumes" Nov 29 14:51:34 crc kubenswrapper[4907]: 
I1129 14:51:34.520963 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.529698 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.549774 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-combined-ca-bundle\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.549822 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-internal-tls-certs\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.549855 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-public-tls-certs\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.549886 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.549934 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-config-data\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.549954 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.549971 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.549985 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.550015 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.550038 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92hh5\" (UniqueName: \"kubernetes.io/projected/d47b748c-ba00-496f-83d0-45aaa1049423-kube-api-access-92hh5\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.550064 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.550089 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6bp\" (UniqueName: \"kubernetes.io/projected/7b49cb25-4ca7-4382-b378-749bf7081894-kube-api-access-nz6bp\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.550108 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-scripts\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.550135 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc 
kubenswrapper[4907]: I1129 14:51:34.550176 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d47b748c-ba00-496f-83d0-45aaa1049423-logs\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.590525 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.627280 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.631909 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.632524 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.659623 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.659686 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.659732 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.659782 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92hh5\" (UniqueName: \"kubernetes.io/projected/d47b748c-ba00-496f-83d0-45aaa1049423-kube-api-access-92hh5\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.659824 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.659867 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6bp\" (UniqueName: \"kubernetes.io/projected/7b49cb25-4ca7-4382-b378-749bf7081894-kube-api-access-nz6bp\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.659885 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-scripts\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.659934 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.660016 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d47b748c-ba00-496f-83d0-45aaa1049423-logs\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.660069 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-combined-ca-bundle\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.660113 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-internal-tls-certs\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.660115 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.660209 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.693829 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.700412 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-public-tls-certs\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.700511 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.700599 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-config-data\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.700625 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " 
pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.705112 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d47b748c-ba00-496f-83d0-45aaa1049423-logs\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.715796 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.733634 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-combined-ca-bundle\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.734052 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-internal-tls-certs\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.734188 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.734655 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-scripts\") pod \"placement-65859db6b4-hwsds\" (UID: 
\"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.734871 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-public-tls-certs\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.735221 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.735973 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.739923 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92hh5\" (UniqueName: \"kubernetes.io/projected/d47b748c-ba00-496f-83d0-45aaa1049423-kube-api-access-92hh5\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.755360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d47b748c-ba00-496f-83d0-45aaa1049423-config-data\") pod \"placement-65859db6b4-hwsds\" (UID: \"d47b748c-ba00-496f-83d0-45aaa1049423\") " pod="openstack/placement-65859db6b4-hwsds" Nov 29 
14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.759775 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6bp\" (UniqueName: \"kubernetes.io/projected/7b49cb25-4ca7-4382-b378-749bf7081894-kube-api-access-nz6bp\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.760593 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.802625 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.802942 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-config-data\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.802967 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: 
I1129 14:51:34.803032 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-scripts\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.803108 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.803151 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-logs\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.803173 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.803261 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49lt\" (UniqueName: \"kubernetes.io/projected/b638c730-93e5-475d-afd6-b1c83c3e4952-kube-api-access-b49lt\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 
crc kubenswrapper[4907]: I1129 14:51:34.815734 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76d8ccf675-75wqf"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.819261 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.826702 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.826752 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.826902 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.826959 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vcrpv" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.827183 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.827319 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.846110 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76d8ccf675-75wqf"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910012 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cmmd\" (UniqueName: \"kubernetes.io/projected/9176edef-f683-4a54-a9b0-3ff55a80347b-kube-api-access-2cmmd\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910060 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-config-data\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910082 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-combined-ca-bundle\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910099 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-public-tls-certs\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910164 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910182 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-config-data\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910198 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910226 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-scripts\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910256 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-scripts\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910307 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-logs\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910320 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910357 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-credential-keys\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910381 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-internal-tls-certs\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910402 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49lt\" (UniqueName: \"kubernetes.io/projected/b638c730-93e5-475d-afd6-b1c83c3e4952-kube-api-access-b49lt\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.910458 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-fernet-keys\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.911700 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.914128 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-logs\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.914344 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.917036 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-scripts\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.919126 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.938603 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-bbc8f7595-4cqhq"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.939195 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.940358 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.946512 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.946793 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xn9gj" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.946924 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.968499 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64d7c8d644-rz2mz"] Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.970775 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.973006 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.974706 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-config-data\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:34 crc kubenswrapper[4907]: I1129 14:51:34.981953 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:34.992492 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-bbc8f7595-4cqhq"] Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.002685 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7c3295c-d537-4302-80c1-ce39f0f4fcb4","Type":"ContainerStarted","Data":"93978f213f0dd7390dc158c33afad7e0b5684fa268d23626f48362b017a603ae"} Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016545 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27887b0e-b017-4255-a5db-817cc7142898-config-data-custom\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016654 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-scripts\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016694 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27887b0e-b017-4255-a5db-817cc7142898-combined-ca-bundle\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016729 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m88hn\" (UniqueName: \"kubernetes.io/projected/27887b0e-b017-4255-a5db-817cc7142898-kube-api-access-m88hn\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016755 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-credential-keys\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016785 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-internal-tls-certs\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016832 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27887b0e-b017-4255-a5db-817cc7142898-config-data\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016854 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-fernet-keys\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016880 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cmmd\" (UniqueName: \"kubernetes.io/projected/9176edef-f683-4a54-a9b0-3ff55a80347b-kube-api-access-2cmmd\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016898 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-config-data\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016918 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-combined-ca-bundle\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016934 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-public-tls-certs\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.016954 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27887b0e-b017-4255-a5db-817cc7142898-logs\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.019190 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49lt\" (UniqueName: \"kubernetes.io/projected/b638c730-93e5-475d-afd6-b1c83c3e4952-kube-api-access-b49lt\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.027168 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-public-tls-certs\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.039483 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64d7c8d644-rz2mz"] Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.049752 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8lc4r" event={"ID":"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de","Type":"ContainerStarted","Data":"39a5a1f7616453bb870e47a8e4ac7c7bfe9eb960884396bf0b6e3d37581774be"} Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.050829 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-credential-keys\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.051184 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-combined-ca-bundle\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.058854 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-fernet-keys\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.059498 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-internal-tls-certs\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.060579 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-config-data\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.077319 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9176edef-f683-4a54-a9b0-3ff55a80347b-scripts\") pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.101407 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cmmd\" (UniqueName: \"kubernetes.io/projected/9176edef-f683-4a54-a9b0-3ff55a80347b-kube-api-access-2cmmd\") 
pod \"keystone-76d8ccf675-75wqf\" (UID: \"9176edef-f683-4a54-a9b0-3ff55a80347b\") " pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.114575 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5vflp" event={"ID":"1e3e611f-5e4c-4b2c-baea-5f74745f315b","Type":"ContainerStarted","Data":"cb28a6d071c6ecb64928ca29f3a0ece93cd1195fca05b49b9612aa48dcedc1ed"} Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118262 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz7sg\" (UniqueName: \"kubernetes.io/projected/f8bb23b2-9a00-4098-b349-ac5221a0d305-kube-api-access-pz7sg\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118296 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bb23b2-9a00-4098-b349-ac5221a0d305-combined-ca-bundle\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118345 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27887b0e-b017-4255-a5db-817cc7142898-combined-ca-bundle\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118386 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m88hn\" (UniqueName: \"kubernetes.io/projected/27887b0e-b017-4255-a5db-817cc7142898-kube-api-access-m88hn\") 
pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118422 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8bb23b2-9a00-4098-b349-ac5221a0d305-config-data\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118486 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27887b0e-b017-4255-a5db-817cc7142898-config-data\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118564 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27887b0e-b017-4255-a5db-817cc7142898-logs\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118601 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8bb23b2-9a00-4098-b349-ac5221a0d305-logs\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118645 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/27887b0e-b017-4255-a5db-817cc7142898-config-data-custom\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.118718 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8bb23b2-9a00-4098-b349-ac5221a0d305-config-data-custom\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.125576 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-t4bvb"] Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.126399 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27887b0e-b017-4255-a5db-817cc7142898-logs\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.128925 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.129725 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27887b0e-b017-4255-a5db-817cc7142898-combined-ca-bundle\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.137683 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27887b0e-b017-4255-a5db-817cc7142898-config-data\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.140285 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"84c56a47f97d62013b8da9d119a843ce1bab62847aa547a797434ec819580560"} Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.140322 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"2f5c5c81f6ef8ef1ae871b8df8b027436a2bb63da4d0897162b299225860193b"} Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.145733 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27887b0e-b017-4255-a5db-817cc7142898-config-data-custom\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.156039 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m88hn\" 
(UniqueName: \"kubernetes.io/projected/27887b0e-b017-4255-a5db-817cc7142898-kube-api-access-m88hn\") pod \"barbican-worker-bbc8f7595-4cqhq\" (UID: \"27887b0e-b017-4255-a5db-817cc7142898\") " pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.163963 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-t4bvb"] Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.164362 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.208906 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.228928 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8bb23b2-9a00-4098-b349-ac5221a0d305-logs\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.229159 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.229192 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f8bb23b2-9a00-4098-b349-ac5221a0d305-config-data-custom\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.229241 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.229325 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz7sg\" (UniqueName: \"kubernetes.io/projected/f8bb23b2-9a00-4098-b349-ac5221a0d305-kube-api-access-pz7sg\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.229406 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bb23b2-9a00-4098-b349-ac5221a0d305-combined-ca-bundle\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.229498 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-config\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.229534 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.229560 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8bb23b2-9a00-4098-b349-ac5221a0d305-config-data\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.229757 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5xs9\" (UniqueName: \"kubernetes.io/projected/f003b36d-c2d9-432a-888b-2779c4b15bf3-kube-api-access-m5xs9\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.230701 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8bb23b2-9a00-4098-b349-ac5221a0d305-logs\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.235381 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5bf6df8b58-8kpjs"] Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.237694 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.240514 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") " pod="openstack/glance-default-external-api-0" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.243825 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.246566 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-8lc4r" podStartSLOduration=5.015414152 podStartE2EDuration="51.246545536s" podCreationTimestamp="2025-11-29 14:50:44 +0000 UTC" firstStartedPulling="2025-11-29 14:50:47.126592219 +0000 UTC m=+1345.113429871" lastFinishedPulling="2025-11-29 14:51:33.357723603 +0000 UTC m=+1391.344561255" observedRunningTime="2025-11-29 14:51:35.080500473 +0000 UTC m=+1393.067338125" watchObservedRunningTime="2025-11-29 14:51:35.246545536 +0000 UTC m=+1393.233383188" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.253171 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8bb23b2-9a00-4098-b349-ac5221a0d305-config-data\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.268013 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8bb23b2-9a00-4098-b349-ac5221a0d305-config-data-custom\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " 
pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.270995 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bb23b2-9a00-4098-b349-ac5221a0d305-combined-ca-bundle\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.281009 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.286015 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz7sg\" (UniqueName: \"kubernetes.io/projected/f8bb23b2-9a00-4098-b349-ac5221a0d305-kube-api-access-pz7sg\") pod \"barbican-keystone-listener-64d7c8d644-rz2mz\" (UID: \"f8bb23b2-9a00-4098-b349-ac5221a0d305\") " pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.289088 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bf6df8b58-8kpjs"] Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.302278 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-5vflp" podStartSLOduration=5.592074777 podStartE2EDuration="52.302258064s" podCreationTimestamp="2025-11-29 14:50:43 +0000 UTC" firstStartedPulling="2025-11-29 14:50:46.64486942 +0000 UTC m=+1344.631707082" lastFinishedPulling="2025-11-29 14:51:33.355052717 +0000 UTC m=+1391.341890369" observedRunningTime="2025-11-29 14:51:35.155193002 +0000 UTC m=+1393.142030654" watchObservedRunningTime="2025-11-29 14:51:35.302258064 +0000 UTC m=+1393.289095716" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.333349 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a84a40-6906-41df-8af5-ba4c16a2c322-logs\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.333740 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5xs9\" (UniqueName: \"kubernetes.io/projected/f003b36d-c2d9-432a-888b-2779c4b15bf3-kube-api-access-m5xs9\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.333812 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-combined-ca-bundle\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.333838 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgvkp\" (UniqueName: \"kubernetes.io/projected/e0a84a40-6906-41df-8af5-ba4c16a2c322-kube-api-access-mgvkp\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.333874 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.333892 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data-custom\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.333918 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.333946 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.333997 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-config\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.334019 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.334983 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.335783 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.339252 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-config\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.356568 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.380202 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.391090 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5xs9\" (UniqueName: \"kubernetes.io/projected/f003b36d-c2d9-432a-888b-2779c4b15bf3-kube-api-access-m5xs9\") pod \"dnsmasq-dns-7d649d8c65-t4bvb\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.437900 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-combined-ca-bundle\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.438163 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgvkp\" (UniqueName: \"kubernetes.io/projected/e0a84a40-6906-41df-8af5-ba4c16a2c322-kube-api-access-mgvkp\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.438207 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data-custom\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.438241 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " 
pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.438313 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a84a40-6906-41df-8af5-ba4c16a2c322-logs\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.438841 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a84a40-6906-41df-8af5-ba4c16a2c322-logs\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.446778 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data-custom\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.464814 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.465552 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-combined-ca-bundle\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.476883 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgvkp\" (UniqueName: \"kubernetes.io/projected/e0a84a40-6906-41df-8af5-ba4c16a2c322-kube-api-access-mgvkp\") pod \"barbican-api-5bf6df8b58-8kpjs\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") " pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.524255 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-bbc8f7595-4cqhq" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.545362 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.571894 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.589483 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:35 crc kubenswrapper[4907]: I1129 14:51:35.859625 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76d8ccf675-75wqf"] Nov 29 14:51:35 crc kubenswrapper[4907]: W1129 14:51:35.938550 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9176edef_f683_4a54_a9b0_3ff55a80347b.slice/crio-905481a137cbbcde124112a5f85fe8bc8ad52699e98fa52f372a91f90fd778bf WatchSource:0}: Error finding container 905481a137cbbcde124112a5f85fe8bc8ad52699e98fa52f372a91f90fd778bf: Status 404 returned error can't find the container with id 905481a137cbbcde124112a5f85fe8bc8ad52699e98fa52f372a91f90fd778bf Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.176943 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65859db6b4-hwsds"] Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.187674 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76d8ccf675-75wqf" event={"ID":"9176edef-f683-4a54-a9b0-3ff55a80347b","Type":"ContainerStarted","Data":"905481a137cbbcde124112a5f85fe8bc8ad52699e98fa52f372a91f90fd778bf"} Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.198320 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68c5f6d545-tlmv5" event={"ID":"e814e290-11f9-48bc-9f3d-36aeecf0ec1a","Type":"ContainerStarted","Data":"1c5297901a383418a2b24c569ba1260a486b34841dac783f6c9a13e7efbe241a"} Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.204826 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.247798 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68c5f6d545-tlmv5" podStartSLOduration=11.247780762 podStartE2EDuration="11.247780762s" 
podCreationTimestamp="2025-11-29 14:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:36.246767153 +0000 UTC m=+1394.233604805" watchObservedRunningTime="2025-11-29 14:51:36.247780762 +0000 UTC m=+1394.234618414" Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.286276 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"210d776036002fa201615072f5136c7d5fd62fbde45aa8f167f78266161a3aae"} Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.514430 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe218c8-f724-425b-ad67-d5ac967bc0c9" path="/var/lib/kubelet/pods/4fe218c8-f724-425b-ad67-d5ac967bc0c9/volumes" Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.619285 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-bbc8f7595-4cqhq"] Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.685135 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:51:36 crc kubenswrapper[4907]: I1129 14:51:36.803922 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.050539 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64d7c8d644-rz2mz"] Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.095308 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bf6df8b58-8kpjs"] Nov 29 14:51:37 crc kubenswrapper[4907]: W1129 14:51:37.104805 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a84a40_6906_41df_8af5_ba4c16a2c322.slice/crio-1971a4b096def06be1875e18535da5a9534a4166ecd592bc64ed558af9b87577 WatchSource:0}: Error finding container 1971a4b096def06be1875e18535da5a9534a4166ecd592bc64ed558af9b87577: Status 404 returned error can't find the container with id 1971a4b096def06be1875e18535da5a9534a4166ecd592bc64ed558af9b87577 Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.112568 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-t4bvb"] Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.368304 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"c1a02cc88eaae58b0876609b03d080c4cc05388599d22501729f0fbd9ded52b3"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.368353 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"4a2ca1d14d32901d6889551ca1eddaffc4e0e936f74901c1e313d6e8451132a3"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.371481 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerStarted","Data":"1971a4b096def06be1875e18535da5a9534a4166ecd592bc64ed558af9b87577"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.391682 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b638c730-93e5-475d-afd6-b1c83c3e4952","Type":"ContainerStarted","Data":"38cb7a61de30e6dfa404f4c379ff5f86300d90e1f3627139dc23b53d425918cc"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.394629 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" 
event={"ID":"f003b36d-c2d9-432a-888b-2779c4b15bf3","Type":"ContainerStarted","Data":"196aac2365943821866ca81cb26e8b31876e14e166eddf50f8cd67e8bc6ae02a"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.407857 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65859db6b4-hwsds" event={"ID":"d47b748c-ba00-496f-83d0-45aaa1049423","Type":"ContainerStarted","Data":"e276bcbcd72cea59e22d1506319824cc02126c08d60a41c08875a963d7b8f8c9"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.407901 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65859db6b4-hwsds" event={"ID":"d47b748c-ba00-496f-83d0-45aaa1049423","Type":"ContainerStarted","Data":"50ab28ad5c2dea824db1c60bd7b3b7f605179e53ff755617b9eabf2776d72cbb"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.412144 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" event={"ID":"f8bb23b2-9a00-4098-b349-ac5221a0d305","Type":"ContainerStarted","Data":"4faa81549573342bb03e7f0cf64d7f78bd8c82e66a34b7fd3d4c5054f20aceec"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.426648 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76d8ccf675-75wqf" event={"ID":"9176edef-f683-4a54-a9b0-3ff55a80347b","Type":"ContainerStarted","Data":"632b0b9166ea585efa28af4c94060f100cf89f50af802e2d04044ffcb5b052ce"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.427118 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76d8ccf675-75wqf" Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.432386 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bbc8f7595-4cqhq" event={"ID":"27887b0e-b017-4255-a5db-817cc7142898","Type":"ContainerStarted","Data":"7a1d5fdcda18c0ac1749c7b7f1d4e82d1c0af842082a646992bb94906758a414"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.435731 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b49cb25-4ca7-4382-b378-749bf7081894","Type":"ContainerStarted","Data":"c97cfc8b0c751b4b13ab7193896660bf44f991eef419d5bab7d33cf41f6cac48"} Nov 29 14:51:37 crc kubenswrapper[4907]: I1129 14:51:37.454041 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76d8ccf675-75wqf" podStartSLOduration=3.454024492 podStartE2EDuration="3.454024492s" podCreationTimestamp="2025-11-29 14:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:37.453010013 +0000 UTC m=+1395.439847665" watchObservedRunningTime="2025-11-29 14:51:37.454024492 +0000 UTC m=+1395.440862154" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.524973 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65859db6b4-hwsds" event={"ID":"d47b748c-ba00-496f-83d0-45aaa1049423","Type":"ContainerStarted","Data":"11e4878691f8bbb10cb14c554ebc41446a73ec48cc0f805e2fd24065c1375b8e"} Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.525544 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.525556 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65859db6b4-hwsds" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.531833 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-c879c8666-gfj6k"] Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.533786 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.537700 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.537871 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.540639 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerStarted","Data":"1b7702135179249a1b7acc97dda04658e6c200182057c477fe9b98a731c287a6"} Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.540666 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerStarted","Data":"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a"} Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.541535 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.541564 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.555075 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c879c8666-gfj6k"] Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.556576 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-65859db6b4-hwsds" podStartSLOduration=4.556557864 podStartE2EDuration="4.556557864s" podCreationTimestamp="2025-11-29 14:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 
14:51:38.528930207 +0000 UTC m=+1396.515767859" watchObservedRunningTime="2025-11-29 14:51:38.556557864 +0000 UTC m=+1396.543395516" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.595356 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b638c730-93e5-475d-afd6-b1c83c3e4952","Type":"ContainerStarted","Data":"a06af6f54d1a144582ea59ec2783f88120c05fdbd4d7f559c2afea2ad660f6db"} Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.600300 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b49cb25-4ca7-4382-b378-749bf7081894","Type":"ContainerStarted","Data":"fae5b500fdd97fec81f970e4aa5152fe8be097c18c5be82a016be44e262b1425"} Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.603824 4907 generic.go:334] "Generic (PLEG): container finished" podID="f003b36d-c2d9-432a-888b-2779c4b15bf3" containerID="09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5" exitCode=0 Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.603876 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" event={"ID":"f003b36d-c2d9-432a-888b-2779c4b15bf3","Type":"ContainerDied","Data":"09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5"} Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.608074 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podStartSLOduration=3.608054032 podStartE2EDuration="3.608054032s" podCreationTimestamp="2025-11-29 14:51:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:38.577951534 +0000 UTC m=+1396.564789186" watchObservedRunningTime="2025-11-29 14:51:38.608054032 +0000 UTC m=+1396.594891684" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.633064 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"4fa54a9d515bc84f9ecca309869484bab5084adea8f3be23f28288ae9901629b"} Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.666828 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-internal-tls-certs\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.666889 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-combined-ca-bundle\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.666969 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfjjj\" (UniqueName: \"kubernetes.io/projected/d00a5123-088f-4681-81b0-89706e0cb7a8-kube-api-access-xfjjj\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.667000 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-public-tls-certs\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.667023 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d00a5123-088f-4681-81b0-89706e0cb7a8-logs\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.667039 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-config-data\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.667095 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-config-data-custom\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.772945 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfjjj\" (UniqueName: \"kubernetes.io/projected/d00a5123-088f-4681-81b0-89706e0cb7a8-kube-api-access-xfjjj\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.773025 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-public-tls-certs\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.773056 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d00a5123-088f-4681-81b0-89706e0cb7a8-logs\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.773075 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-config-data\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.773160 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-config-data-custom\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.773318 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-internal-tls-certs\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.773353 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-combined-ca-bundle\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.778853 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d00a5123-088f-4681-81b0-89706e0cb7a8-logs\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.802960 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-internal-tls-certs\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.803274 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-public-tls-certs\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.803121 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-combined-ca-bundle\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.813101 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfjjj\" (UniqueName: \"kubernetes.io/projected/d00a5123-088f-4681-81b0-89706e0cb7a8-kube-api-access-xfjjj\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.814150 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-config-data\") pod 
\"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.814725 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d00a5123-088f-4681-81b0-89706e0cb7a8-config-data-custom\") pod \"barbican-api-c879c8666-gfj6k\" (UID: \"d00a5123-088f-4681-81b0-89706e0cb7a8\") " pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:38 crc kubenswrapper[4907]: I1129 14:51:38.882229 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:39 crc kubenswrapper[4907]: E1129 14:51:39.480323 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a84a40_6906_41df_8af5_ba4c16a2c322.slice/crio-conmon-1b7702135179249a1b7acc97dda04658e6c200182057c477fe9b98a731c287a6.scope\": RecentStats: unable to find data in memory cache]" Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.584366 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c879c8666-gfj6k"] Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.648063 4907 generic.go:334] "Generic (PLEG): container finished" podID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerID="1b7702135179249a1b7acc97dda04658e6c200182057c477fe9b98a731c287a6" exitCode=1 Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.648168 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerDied","Data":"1b7702135179249a1b7acc97dda04658e6c200182057c477fe9b98a731c287a6"} Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.648999 4907 scope.go:117] "RemoveContainer" 
containerID="1b7702135179249a1b7acc97dda04658e6c200182057c477fe9b98a731c287a6" Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.650537 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b638c730-93e5-475d-afd6-b1c83c3e4952","Type":"ContainerStarted","Data":"055477b9a1e16f2b062a2b014f3463bf1912746d19f695a05a3cdf41f9d3f1bf"} Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.653896 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b49cb25-4ca7-4382-b378-749bf7081894","Type":"ContainerStarted","Data":"2df3e3d52990970ce9e869c42b9bcf5b24cd86a53fa741f9a60d8153a701cbd4"} Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.656428 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" event={"ID":"f003b36d-c2d9-432a-888b-2779c4b15bf3","Type":"ContainerStarted","Data":"e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256"} Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.656612 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.706737 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe027ad6-8a24-44b5-8bfb-732d5c8fe22a","Type":"ContainerStarted","Data":"630ede802514a19ebe7e5b2d1f9e4353735851b0566040bbb4d3d51b8d14ca60"} Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.708278 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.708250009 podStartE2EDuration="5.708250009s" podCreationTimestamp="2025-11-29 14:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:39.698264634 +0000 UTC m=+1397.685102306" 
watchObservedRunningTime="2025-11-29 14:51:39.708250009 +0000 UTC m=+1397.695087661" Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.744022 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.743993638 podStartE2EDuration="5.743993638s" podCreationTimestamp="2025-11-29 14:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:39.722016631 +0000 UTC m=+1397.708854283" watchObservedRunningTime="2025-11-29 14:51:39.743993638 +0000 UTC m=+1397.730831290" Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.766630 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" podStartSLOduration=5.766612382 podStartE2EDuration="5.766612382s" podCreationTimestamp="2025-11-29 14:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:39.740170869 +0000 UTC m=+1397.727008511" watchObservedRunningTime="2025-11-29 14:51:39.766612382 +0000 UTC m=+1397.753450024" Nov 29 14:51:39 crc kubenswrapper[4907]: I1129 14:51:39.785611 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=42.005123469 podStartE2EDuration="1m41.785586243s" podCreationTimestamp="2025-11-29 14:49:58 +0000 UTC" firstStartedPulling="2025-11-29 14:50:33.345092142 +0000 UTC m=+1331.331929794" lastFinishedPulling="2025-11-29 14:51:33.125554926 +0000 UTC m=+1391.112392568" observedRunningTime="2025-11-29 14:51:39.77251269 +0000 UTC m=+1397.759350342" watchObservedRunningTime="2025-11-29 14:51:39.785586243 +0000 UTC m=+1397.772423895" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.051044 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7d649d8c65-t4bvb"] Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.082867 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-wgn84"] Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.085015 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.086775 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.102157 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-wgn84"] Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.224807 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-svc\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.224857 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.224928 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 
14:51:40.224986 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r576j\" (UniqueName: \"kubernetes.io/projected/757103ec-cde8-4a4d-9114-bb121bfe7ffa-kube-api-access-r576j\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.225048 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.225085 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-config\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.327348 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-svc\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.327667 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 
14:51:40.327790 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-svc\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.328546 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.328701 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.329695 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.329846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r576j\" (UniqueName: \"kubernetes.io/projected/757103ec-cde8-4a4d-9114-bb121bfe7ffa-kube-api-access-r576j\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.330310 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.331156 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.331219 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-config\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.332050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-config\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.353335 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r576j\" (UniqueName: \"kubernetes.io/projected/757103ec-cde8-4a4d-9114-bb121bfe7ffa-kube-api-access-r576j\") pod \"dnsmasq-dns-85ff748b95-wgn84\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.435556 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:40 crc kubenswrapper[4907]: I1129 14:51:40.727401 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c879c8666-gfj6k" event={"ID":"d00a5123-088f-4681-81b0-89706e0cb7a8","Type":"ContainerStarted","Data":"5bbee62d8d082b4163b18bc1cd41857633549f234089edcb1325447a4dea6cd8"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.349912 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-wgn84"] Nov 29 14:51:41 crc kubenswrapper[4907]: W1129 14:51:41.351353 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod757103ec_cde8_4a4d_9114_bb121bfe7ffa.slice/crio-35fa6811d2422c0ce1bf25546ab1c9c68c3adb3826ef52129714b9e9d4534658 WatchSource:0}: Error finding container 35fa6811d2422c0ce1bf25546ab1c9c68c3adb3826ef52129714b9e9d4534658: Status 404 returned error can't find the container with id 35fa6811d2422c0ce1bf25546ab1c9c68c3adb3826ef52129714b9e9d4534658 Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.590730 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.748759 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bbc8f7595-4cqhq" event={"ID":"27887b0e-b017-4255-a5db-817cc7142898","Type":"ContainerStarted","Data":"3d5d577d4ed95dcb7f7ea6cd6d9f7197578873ed098af4a52bb169f67446bbc9"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.748830 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bbc8f7595-4cqhq" event={"ID":"27887b0e-b017-4255-a5db-817cc7142898","Type":"ContainerStarted","Data":"2dc074cd1a66d4c388643ac746a027cb9a7a9ceeade63480629624bc45297b85"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.750842 4907 generic.go:334] 
"Generic (PLEG): container finished" podID="757103ec-cde8-4a4d-9114-bb121bfe7ffa" containerID="0b435a3ead7b9e95eeaa714473bcedbe489950e7229f334a5f08470644d91768" exitCode=0 Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.750891 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" event={"ID":"757103ec-cde8-4a4d-9114-bb121bfe7ffa","Type":"ContainerDied","Data":"0b435a3ead7b9e95eeaa714473bcedbe489950e7229f334a5f08470644d91768"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.750908 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" event={"ID":"757103ec-cde8-4a4d-9114-bb121bfe7ffa","Type":"ContainerStarted","Data":"35fa6811d2422c0ce1bf25546ab1c9c68c3adb3826ef52129714b9e9d4534658"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.754808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c879c8666-gfj6k" event={"ID":"d00a5123-088f-4681-81b0-89706e0cb7a8","Type":"ContainerStarted","Data":"152aa2851737c23ec0e1040305158d4ebfd32451ca9552ca06bd7035c17fabaf"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.754863 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c879c8666-gfj6k" event={"ID":"d00a5123-088f-4681-81b0-89706e0cb7a8","Type":"ContainerStarted","Data":"9e04617667e20cf442ddb9f2d7b8644464195bb625acca5d599195d3d439031c"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.754881 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.754910 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c879c8666-gfj6k" Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.770782 4907 generic.go:334] "Generic (PLEG): container finished" podID="e0a84a40-6906-41df-8af5-ba4c16a2c322" 
containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d" exitCode=1 Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.770893 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerDied","Data":"568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.770934 4907 scope.go:117] "RemoveContainer" containerID="1b7702135179249a1b7acc97dda04658e6c200182057c477fe9b98a731c287a6" Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.771802 4907 scope.go:117] "RemoveContainer" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d" Nov 29 14:51:41 crc kubenswrapper[4907]: E1129 14:51:41.775403 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-5bf6df8b58-8kpjs_openstack(e0a84a40-6906-41df-8af5-ba4c16a2c322)\"" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.780568 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" event={"ID":"f8bb23b2-9a00-4098-b349-ac5221a0d305","Type":"ContainerStarted","Data":"f704e45cab4d00ec0fe9e0912b457b049331ad5df1af002653db16bbe307a217"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.780616 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" event={"ID":"f8bb23b2-9a00-4098-b349-ac5221a0d305","Type":"ContainerStarted","Data":"87565835a9cc7ffe42b6c03b831c905fc9b6bb8d60af5d7b78d65112c2652c54"} Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.780636 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" podUID="f003b36d-c2d9-432a-888b-2779c4b15bf3" containerName="dnsmasq-dns" containerID="cri-o://e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256" gracePeriod=10 Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.785580 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-bbc8f7595-4cqhq" podStartSLOduration=3.6260661450000002 podStartE2EDuration="7.785562845s" podCreationTimestamp="2025-11-29 14:51:34 +0000 UTC" firstStartedPulling="2025-11-29 14:51:36.697115549 +0000 UTC m=+1394.683953201" lastFinishedPulling="2025-11-29 14:51:40.856612249 +0000 UTC m=+1398.843449901" observedRunningTime="2025-11-29 14:51:41.771867935 +0000 UTC m=+1399.758705587" watchObservedRunningTime="2025-11-29 14:51:41.785562845 +0000 UTC m=+1399.772400497" Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.884819 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-c879c8666-gfj6k" podStartSLOduration=3.884797613 podStartE2EDuration="3.884797613s" podCreationTimestamp="2025-11-29 14:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:41.8580058 +0000 UTC m=+1399.844843452" watchObservedRunningTime="2025-11-29 14:51:41.884797613 +0000 UTC m=+1399.871635265" Nov 29 14:51:41 crc kubenswrapper[4907]: I1129 14:51:41.901843 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64d7c8d644-rz2mz" podStartSLOduration=4.173784805 podStartE2EDuration="7.901820018s" podCreationTimestamp="2025-11-29 14:51:34 +0000 UTC" firstStartedPulling="2025-11-29 14:51:37.138732915 +0000 UTC m=+1395.125570567" lastFinishedPulling="2025-11-29 14:51:40.866768128 +0000 UTC m=+1398.853605780" observedRunningTime="2025-11-29 14:51:41.895000724 +0000 UTC m=+1399.881838396" 
watchObservedRunningTime="2025-11-29 14:51:41.901820018 +0000 UTC m=+1399.888657670" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.459332 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.528883 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-sb\") pod \"f003b36d-c2d9-432a-888b-2779c4b15bf3\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.528939 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-dns-svc\") pod \"f003b36d-c2d9-432a-888b-2779c4b15bf3\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.528966 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-config\") pod \"f003b36d-c2d9-432a-888b-2779c4b15bf3\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.529125 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5xs9\" (UniqueName: \"kubernetes.io/projected/f003b36d-c2d9-432a-888b-2779c4b15bf3-kube-api-access-m5xs9\") pod \"f003b36d-c2d9-432a-888b-2779c4b15bf3\" (UID: \"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.529220 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-nb\") pod \"f003b36d-c2d9-432a-888b-2779c4b15bf3\" (UID: 
\"f003b36d-c2d9-432a-888b-2779c4b15bf3\") " Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.598726 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f003b36d-c2d9-432a-888b-2779c4b15bf3-kube-api-access-m5xs9" (OuterVolumeSpecName: "kube-api-access-m5xs9") pod "f003b36d-c2d9-432a-888b-2779c4b15bf3" (UID: "f003b36d-c2d9-432a-888b-2779c4b15bf3"). InnerVolumeSpecName "kube-api-access-m5xs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.636869 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5xs9\" (UniqueName: \"kubernetes.io/projected/f003b36d-c2d9-432a-888b-2779c4b15bf3-kube-api-access-m5xs9\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.753145 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f003b36d-c2d9-432a-888b-2779c4b15bf3" (UID: "f003b36d-c2d9-432a-888b-2779c4b15bf3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.759198 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f003b36d-c2d9-432a-888b-2779c4b15bf3" (UID: "f003b36d-c2d9-432a-888b-2779c4b15bf3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.790294 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f003b36d-c2d9-432a-888b-2779c4b15bf3" (UID: "f003b36d-c2d9-432a-888b-2779c4b15bf3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.793146 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-config" (OuterVolumeSpecName: "config") pod "f003b36d-c2d9-432a-888b-2779c4b15bf3" (UID: "f003b36d-c2d9-432a-888b-2779c4b15bf3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.796877 4907 scope.go:117] "RemoveContainer" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.798512 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" event={"ID":"757103ec-cde8-4a4d-9114-bb121bfe7ffa","Type":"ContainerStarted","Data":"95ee5d767931b90a870d070e91b9f2c540f1ce1c95a215f2007367980a37cf99"} Nov 29 14:51:42 crc kubenswrapper[4907]: E1129 14:51:42.799830 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-5bf6df8b58-8kpjs_openstack(e0a84a40-6906-41df-8af5-ba4c16a2c322)\"" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.802515 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bf6df8b58-8kpjs" 
podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.196:9311/healthcheck\": dial tcp 10.217.0.196:9311: connect: connection refused" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.805665 4907 generic.go:334] "Generic (PLEG): container finished" podID="f003b36d-c2d9-432a-888b-2779c4b15bf3" containerID="e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256" exitCode=0 Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.806957 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.808129 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" event={"ID":"f003b36d-c2d9-432a-888b-2779c4b15bf3","Type":"ContainerDied","Data":"e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256"} Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.808176 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-t4bvb" event={"ID":"f003b36d-c2d9-432a-888b-2779c4b15bf3","Type":"ContainerDied","Data":"196aac2365943821866ca81cb26e8b31876e14e166eddf50f8cd67e8bc6ae02a"} Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.808198 4907 scope.go:117] "RemoveContainer" containerID="e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.839555 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" podStartSLOduration=2.839534313 podStartE2EDuration="2.839534313s" podCreationTimestamp="2025-11-29 14:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:42.834206962 +0000 UTC m=+1400.821044624" watchObservedRunningTime="2025-11-29 
14:51:42.839534313 +0000 UTC m=+1400.826371965" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.841958 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.850609 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.850711 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.850794 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f003b36d-c2d9-432a-888b-2779c4b15bf3-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.873294 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-t4bvb"] Nov 29 14:51:42 crc kubenswrapper[4907]: I1129 14:51:42.883050 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-t4bvb"] Nov 29 14:51:43 crc kubenswrapper[4907]: I1129 14:51:43.822007 4907 generic.go:334] "Generic (PLEG): container finished" podID="1e3e611f-5e4c-4b2c-baea-5f74745f315b" containerID="cb28a6d071c6ecb64928ca29f3a0ece93cd1195fca05b49b9612aa48dcedc1ed" exitCode=0 Nov 29 14:51:43 crc kubenswrapper[4907]: I1129 14:51:43.823175 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5vflp" event={"ID":"1e3e611f-5e4c-4b2c-baea-5f74745f315b","Type":"ContainerDied","Data":"cb28a6d071c6ecb64928ca29f3a0ece93cd1195fca05b49b9612aa48dcedc1ed"} Nov 29 14:51:43 crc 
kubenswrapper[4907]: I1129 14:51:43.823221 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:44 crc kubenswrapper[4907]: I1129 14:51:44.494603 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f003b36d-c2d9-432a-888b-2779c4b15bf3" path="/var/lib/kubelet/pods/f003b36d-c2d9-432a-888b-2779c4b15bf3/volumes" Nov 29 14:51:44 crc kubenswrapper[4907]: I1129 14:51:44.590235 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:44 crc kubenswrapper[4907]: I1129 14:51:44.590833 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.196:9311/healthcheck\": dial tcp 10.217.0.196:9311: connect: connection refused" Nov 29 14:51:44 crc kubenswrapper[4907]: I1129 14:51:44.590876 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.196:9311/healthcheck\": dial tcp 10.217.0.196:9311: connect: connection refused" Nov 29 14:51:44 crc kubenswrapper[4907]: I1129 14:51:44.591159 4907 scope.go:117] "RemoveContainer" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d" Nov 29 14:51:44 crc kubenswrapper[4907]: E1129 14:51:44.591413 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-5bf6df8b58-8kpjs_openstack(e0a84a40-6906-41df-8af5-ba4c16a2c322)\"" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" Nov 29 14:51:44 crc kubenswrapper[4907]: 
I1129 14:51:44.836948 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" containerID="39a5a1f7616453bb870e47a8e4ac7c7bfe9eb960884396bf0b6e3d37581774be" exitCode=0 Nov 29 14:51:44 crc kubenswrapper[4907]: I1129 14:51:44.837047 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8lc4r" event={"ID":"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de","Type":"ContainerDied","Data":"39a5a1f7616453bb870e47a8e4ac7c7bfe9eb960884396bf0b6e3d37581774be"} Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.282239 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.282625 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.313294 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.327941 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.381455 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.381502 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.418958 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.453179 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 
14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.590630 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.591657 4907 scope.go:117] "RemoveContainer" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.591770 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.196:9311/healthcheck\": dial tcp 10.217.0.196:9311: connect: connection refused" Nov 29 14:51:45 crc kubenswrapper[4907]: E1129 14:51:45.592056 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-5bf6df8b58-8kpjs_openstack(e0a84a40-6906-41df-8af5-ba4c16a2c322)\"" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.592514 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.196:9311/healthcheck\": dial tcp 10.217.0.196:9311: connect: connection refused" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.852163 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.852230 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.852254 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:45 crc kubenswrapper[4907]: I1129 14:51:45.852273 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.590721 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.196:9311/healthcheck\": dial tcp 10.217.0.196:9311: connect: connection refused" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.656989 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-5vflp" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.662933 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.777953 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvrdk\" (UniqueName: \"kubernetes.io/projected/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-kube-api-access-jvrdk\") pod \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.778108 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-scripts\") pod \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.778148 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-combined-ca-bundle\") pod 
\"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.778195 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ltf5\" (UniqueName: \"kubernetes.io/projected/1e3e611f-5e4c-4b2c-baea-5f74745f315b-kube-api-access-9ltf5\") pod \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.778222 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-etc-machine-id\") pod \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.778347 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-combined-ca-bundle\") pod \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.778373 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-config-data\") pod \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\" (UID: \"1e3e611f-5e4c-4b2c-baea-5f74745f315b\") " Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.778406 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-config-data\") pod \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.778515 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-db-sync-config-data\") pod \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\" (UID: \"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de\") " Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.780213 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" (UID: "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.785777 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" (UID: "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.788975 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-kube-api-access-jvrdk" (OuterVolumeSpecName: "kube-api-access-jvrdk") pod "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" (UID: "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de"). InnerVolumeSpecName "kube-api-access-jvrdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.790091 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e3e611f-5e4c-4b2c-baea-5f74745f315b-kube-api-access-9ltf5" (OuterVolumeSpecName: "kube-api-access-9ltf5") pod "1e3e611f-5e4c-4b2c-baea-5f74745f315b" (UID: "1e3e611f-5e4c-4b2c-baea-5f74745f315b"). InnerVolumeSpecName "kube-api-access-9ltf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.809240 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-scripts" (OuterVolumeSpecName: "scripts") pod "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" (UID: "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.820365 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e3e611f-5e4c-4b2c-baea-5f74745f315b" (UID: "1e3e611f-5e4c-4b2c-baea-5f74745f315b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.832863 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" (UID: "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.872423 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8lc4r" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.872415 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8lc4r" event={"ID":"d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de","Type":"ContainerDied","Data":"b78c3106b9c871c610e3f0abdd66d771187193ceaba843801c1d35e0cac031e2"} Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.872554 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b78c3106b9c871c610e3f0abdd66d771187193ceaba843801c1d35e0cac031e2" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.872713 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-config-data" (OuterVolumeSpecName: "config-data") pod "1e3e611f-5e4c-4b2c-baea-5f74745f315b" (UID: "1e3e611f-5e4c-4b2c-baea-5f74745f315b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.874617 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5vflp" event={"ID":"1e3e611f-5e4c-4b2c-baea-5f74745f315b","Type":"ContainerDied","Data":"07f770c7fe3a0c5269fb3c051e3bf155278a48c4bf21f27bc8132761ab769df0"} Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.874680 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07f770c7fe3a0c5269fb3c051e3bf155278a48c4bf21f27bc8132761ab769df0" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.874654 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-5vflp" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.875454 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-config-data" (OuterVolumeSpecName: "config-data") pod "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" (UID: "d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.881143 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvrdk\" (UniqueName: \"kubernetes.io/projected/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-kube-api-access-jvrdk\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.881169 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.881180 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.881189 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ltf5\" (UniqueName: \"kubernetes.io/projected/1e3e611f-5e4c-4b2c-baea-5f74745f315b-kube-api-access-9ltf5\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.881197 4907 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.881205 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.881213 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e3e611f-5e4c-4b2c-baea-5f74745f315b-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.881221 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:47 crc kubenswrapper[4907]: I1129 14:51:47.881229 4907 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:48 crc kubenswrapper[4907]: E1129 14:51:48.236030 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e527c4_b1c6_4f80_a37c_a6fe6ab8c5de.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e527c4_b1c6_4f80_a37c_a6fe6ab8c5de.slice/crio-b78c3106b9c871c610e3f0abdd66d771187193ceaba843801c1d35e0cac031e2\": RecentStats: unable to find data in memory cache]" Nov 29 14:51:48 crc kubenswrapper[4907]: E1129 14:51:48.236142 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e527c4_b1c6_4f80_a37c_a6fe6ab8c5de.slice/crio-b78c3106b9c871c610e3f0abdd66d771187193ceaba843801c1d35e0cac031e2\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e527c4_b1c6_4f80_a37c_a6fe6ab8c5de.slice\": RecentStats: unable to find data in memory cache]" Nov 29 14:51:48 crc kubenswrapper[4907]: I1129 14:51:48.923720 4907 scope.go:117] "RemoveContainer" containerID="09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.057430 4907 scope.go:117] "RemoveContainer" containerID="e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256" Nov 29 14:51:49 crc kubenswrapper[4907]: E1129 14:51:49.059867 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256\": container with ID starting with e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256 not found: ID does not exist" containerID="e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.062325 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256"} err="failed to get container status \"e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256\": rpc error: code = NotFound desc = could not find container \"e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256\": container with ID starting with e664ab446ed4c9d62cd5906dcab549cbb5a3fdcd910eadb1e2bc2b898de07256 not found: ID does not exist" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.062603 4907 scope.go:117] "RemoveContainer" containerID="09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5" Nov 29 14:51:49 crc kubenswrapper[4907]: E1129 14:51:49.077239 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5\": container with ID starting with 09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5 not found: ID does not exist" containerID="09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.077517 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5"} err="failed to get container status \"09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5\": rpc error: code = NotFound desc = could not find container \"09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5\": container with ID starting with 09451d9261ff6cbb66e0628b6890af9059287997a4be421373a5b22a56b671c5 not found: ID does not exist" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.137868 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 14:51:49 crc kubenswrapper[4907]: E1129 14:51:49.138429 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f003b36d-c2d9-432a-888b-2779c4b15bf3" containerName="init" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.138471 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f003b36d-c2d9-432a-888b-2779c4b15bf3" containerName="init" Nov 29 14:51:49 crc kubenswrapper[4907]: E1129 14:51:49.138493 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" containerName="cinder-db-sync" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.138500 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" containerName="cinder-db-sync" Nov 29 14:51:49 crc kubenswrapper[4907]: E1129 14:51:49.138512 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3e611f-5e4c-4b2c-baea-5f74745f315b" containerName="heat-db-sync" Nov 29 
14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.138517 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3e611f-5e4c-4b2c-baea-5f74745f315b" containerName="heat-db-sync" Nov 29 14:51:49 crc kubenswrapper[4907]: E1129 14:51:49.138554 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f003b36d-c2d9-432a-888b-2779c4b15bf3" containerName="dnsmasq-dns" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.138560 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f003b36d-c2d9-432a-888b-2779c4b15bf3" containerName="dnsmasq-dns" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.138774 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e3e611f-5e4c-4b2c-baea-5f74745f315b" containerName="heat-db-sync" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.138786 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f003b36d-c2d9-432a-888b-2779c4b15bf3" containerName="dnsmasq-dns" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.138806 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" containerName="cinder-db-sync" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.161031 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.164836 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.165265 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.165515 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fqcjq" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.165649 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.203287 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.236775 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.236836 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-scripts\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.236861 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " 
pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.236883 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94ed39fa-abc7-45e2-8919-43609b0bfd9e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.236929 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.237037 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78pqv\" (UniqueName: \"kubernetes.io/projected/94ed39fa-abc7-45e2-8919-43609b0bfd9e-kube-api-access-78pqv\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.297179 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-wgn84"] Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.297430 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" podUID="757103ec-cde8-4a4d-9114-bb121bfe7ffa" containerName="dnsmasq-dns" containerID="cri-o://95ee5d767931b90a870d070e91b9f2c540f1ce1c95a215f2007367980a37cf99" gracePeriod=10 Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.299012 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.338457 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78pqv\" (UniqueName: \"kubernetes.io/projected/94ed39fa-abc7-45e2-8919-43609b0bfd9e-kube-api-access-78pqv\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.338823 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.338864 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-scripts\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.338883 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.338907 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94ed39fa-abc7-45e2-8919-43609b0bfd9e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.338949 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.341860 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94ed39fa-abc7-45e2-8919-43609b0bfd9e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.347360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.355208 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.355275 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2gnn4"] Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.363791 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.366342 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78pqv\" (UniqueName: \"kubernetes.io/projected/94ed39fa-abc7-45e2-8919-43609b0bfd9e-kube-api-access-78pqv\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.366605 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.375183 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-scripts\") pod \"cinder-scheduler-0\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.410296 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2gnn4"] Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.449991 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.450089 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-config\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" 
(UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.450112 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlf5m\" (UniqueName: \"kubernetes.io/projected/74afd491-5339-485f-8a90-91f658a5f98e-kube-api-access-zlf5m\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.450140 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.450172 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.450256 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.517570 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.520915 4907 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.531736 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553273 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553359 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2941047a-90ba-4b26-8678-174546f90d18-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553406 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553454 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v5st\" (UniqueName: \"kubernetes.io/projected/2941047a-90ba-4b26-8678-174546f90d18-kube-api-access-6v5st\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553525 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-scripts\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553603 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-config\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553673 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlf5m\" (UniqueName: \"kubernetes.io/projected/74afd491-5339-485f-8a90-91f658a5f98e-kube-api-access-zlf5m\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553703 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553725 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553766 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553785 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2941047a-90ba-4b26-8678-174546f90d18-logs\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.553816 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data-custom\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.555274 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.555876 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.556514 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-config\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.558541 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.559032 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.584589 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.586262 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlf5m\" (UniqueName: \"kubernetes.io/projected/74afd491-5339-485f-8a90-91f658a5f98e-kube-api-access-zlf5m\") pod \"dnsmasq-dns-5c9776ccc5-2gnn4\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.615097 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.662709 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.662819 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2941047a-90ba-4b26-8678-174546f90d18-logs\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.662883 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data-custom\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.663029 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.663056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2941047a-90ba-4b26-8678-174546f90d18-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.663162 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v5st\" 
(UniqueName: \"kubernetes.io/projected/2941047a-90ba-4b26-8678-174546f90d18-kube-api-access-6v5st\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.663252 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-scripts\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.667992 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2941047a-90ba-4b26-8678-174546f90d18-logs\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.693063 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data-custom\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.694586 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2941047a-90ba-4b26-8678-174546f90d18-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.695312 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-scripts\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.698175 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.703705 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.710074 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v5st\" (UniqueName: \"kubernetes.io/projected/2941047a-90ba-4b26-8678-174546f90d18-kube-api-access-6v5st\") pod \"cinder-api-0\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") " pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.836013 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.864000 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 29 14:51:49 crc kubenswrapper[4907]: E1129 14:51:49.894068 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.921123 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7c3295c-d537-4302-80c1-ce39f0f4fcb4","Type":"ContainerStarted","Data":"bb5021954c0564f02f26a1e397494dbee334ab66f26e5cd69e9f1ab394875cff"} Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.921278 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="ceilometer-notification-agent" containerID="cri-o://5bb03f96ea58f6710f35fa0e0f2b34eaf307facac712b36e63bcc71d9ab89995" gracePeriod=30 Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.921358 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.921752 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="proxy-httpd" containerID="cri-o://bb5021954c0564f02f26a1e397494dbee334ab66f26e5cd69e9f1ab394875cff" gracePeriod=30 Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.921800 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="sg-core" containerID="cri-o://93978f213f0dd7390dc158c33afad7e0b5684fa268d23626f48362b017a603ae" gracePeriod=30 Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.954680 4907 generic.go:334] 
"Generic (PLEG): container finished" podID="757103ec-cde8-4a4d-9114-bb121bfe7ffa" containerID="95ee5d767931b90a870d070e91b9f2c540f1ce1c95a215f2007367980a37cf99" exitCode=0 Nov 29 14:51:49 crc kubenswrapper[4907]: I1129 14:51:49.954869 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" event={"ID":"757103ec-cde8-4a4d-9114-bb121bfe7ffa","Type":"ContainerDied","Data":"95ee5d767931b90a870d070e91b9f2c540f1ce1c95a215f2007367980a37cf99"} Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.148781 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.182281 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-nb\") pod \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.182661 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r576j\" (UniqueName: \"kubernetes.io/projected/757103ec-cde8-4a4d-9114-bb121bfe7ffa-kube-api-access-r576j\") pod \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.183493 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-svc\") pod \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.183531 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-config\") pod 
\"757103ec-cde8-4a4d-9114-bb121bfe7ffa\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.183668 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-sb\") pod \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.183753 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-swift-storage-0\") pod \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\" (UID: \"757103ec-cde8-4a4d-9114-bb121bfe7ffa\") " Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.200637 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.232693 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757103ec-cde8-4a4d-9114-bb121bfe7ffa-kube-api-access-r576j" (OuterVolumeSpecName: "kube-api-access-r576j") pod "757103ec-cde8-4a4d-9114-bb121bfe7ffa" (UID: "757103ec-cde8-4a4d-9114-bb121bfe7ffa"). InnerVolumeSpecName "kube-api-access-r576j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.286947 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r576j\" (UniqueName: \"kubernetes.io/projected/757103ec-cde8-4a4d-9114-bb121bfe7ffa-kube-api-access-r576j\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.310114 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "757103ec-cde8-4a4d-9114-bb121bfe7ffa" (UID: "757103ec-cde8-4a4d-9114-bb121bfe7ffa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.379925 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "757103ec-cde8-4a4d-9114-bb121bfe7ffa" (UID: "757103ec-cde8-4a4d-9114-bb121bfe7ffa"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.393029 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.393070 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.436562 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.469735 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-config" (OuterVolumeSpecName: "config") pod "757103ec-cde8-4a4d-9114-bb121bfe7ffa" (UID: "757103ec-cde8-4a4d-9114-bb121bfe7ffa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.477879 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "757103ec-cde8-4a4d-9114-bb121bfe7ffa" (UID: "757103ec-cde8-4a4d-9114-bb121bfe7ffa"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.500894 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.500945 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.503019 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "757103ec-cde8-4a4d-9114-bb121bfe7ffa" (UID: "757103ec-cde8-4a4d-9114-bb121bfe7ffa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.592023 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.196:9311/healthcheck\": dial tcp 10.217.0.196:9311: connect: connection refused" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.592098 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/barbican-api-5bf6df8b58-8kpjs" Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.592946 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="barbican-api-log" containerStatusID={"Type":"cri-o","ID":"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a"} pod="openstack/barbican-api-5bf6df8b58-8kpjs" containerMessage="Container barbican-api-log failed liveness probe, will be restarted" Nov 29 14:51:50 crc 
kubenswrapper[4907]: I1129 14:51:50.592971 4907 scope.go:117] "RemoveContainer" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d"
Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.592992 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" containerID="cri-o://4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a" gracePeriod=30
Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.593579 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.196:9311/healthcheck\": dial tcp 10.217.0.196:9311: connect: connection refused"
Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.594103 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.196:9311/healthcheck\": dial tcp 10.217.0.196:9311: connect: connection refused"
Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.603547 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757103ec-cde8-4a4d-9114-bb121bfe7ffa-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:50 crc kubenswrapper[4907]: I1129 14:51:50.785502 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2gnn4"]
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.001230 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.005256 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" event={"ID":"74afd491-5339-485f-8a90-91f658a5f98e","Type":"ContainerStarted","Data":"af9d3be9fb96a1ae53e10934ea743891fba47e7d3267d54159cd00b19065e73b"}
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.005401 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.007898 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.012334 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"94ed39fa-abc7-45e2-8919-43609b0bfd9e","Type":"ContainerStarted","Data":"d55703ac9173502f31a62d482f92121a7b22f0e2cdb18764cbd007a18f93fef6"}
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.026161 4907 generic.go:334] "Generic (PLEG): container finished" podID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerID="93978f213f0dd7390dc158c33afad7e0b5684fa268d23626f48362b017a603ae" exitCode=2
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.026245 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7c3295c-d537-4302-80c1-ce39f0f4fcb4","Type":"ContainerDied","Data":"93978f213f0dd7390dc158c33afad7e0b5684fa268d23626f48362b017a603ae"}
Nov 29 14:51:51 crc kubenswrapper[4907]: E1129 14:51:51.072089 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-5bf6df8b58-8kpjs_openstack(e0a84a40-6906-41df-8af5-ba4c16a2c322)\"" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.076097 4907 generic.go:334] "Generic (PLEG): container finished" podID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerID="4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a" exitCode=143
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.076183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerDied","Data":"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a"}
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.092778 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.092879 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.146770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-wgn84" event={"ID":"757103ec-cde8-4a4d-9114-bb121bfe7ffa","Type":"ContainerDied","Data":"35fa6811d2422c0ce1bf25546ab1c9c68c3adb3826ef52129714b9e9d4534658"}
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.147192 4907 scope.go:117] "RemoveContainer" containerID="95ee5d767931b90a870d070e91b9f2c540f1ce1c95a215f2007367980a37cf99"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.148959 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-wgn84"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.219920 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.366788 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-wgn84"]
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.386098 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.387214 4907 scope.go:117] "RemoveContainer" containerID="0b435a3ead7b9e95eeaa714473bcedbe489950e7229f334a5f08470644d91768"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.391941 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-wgn84"]
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.535391 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c879c8666-gfj6k"
Nov 29 14:51:51 crc kubenswrapper[4907]: I1129 14:51:51.676550 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Nov 29 14:51:52 crc kubenswrapper[4907]: I1129 14:51:52.176201 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c879c8666-gfj6k"
Nov 29 14:51:52 crc kubenswrapper[4907]: I1129 14:51:52.194173 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerStarted","Data":"c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803"}
Nov 29 14:51:52 crc kubenswrapper[4907]: I1129 14:51:52.194303 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bf6df8b58-8kpjs"
Nov 29 14:51:52 crc kubenswrapper[4907]: I1129 14:51:52.194817 4907 scope.go:117] "RemoveContainer" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d"
Nov 29 14:51:52 crc kubenswrapper[4907]: I1129 14:51:52.210638 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2941047a-90ba-4b26-8678-174546f90d18","Type":"ContainerStarted","Data":"cc31111472cfc86ae069eaed0a3c0f0282f4be6055ba1bce5670509d6fe0cda7"}
Nov 29 14:51:52 crc kubenswrapper[4907]: I1129 14:51:52.245533 4907 generic.go:334] "Generic (PLEG): container finished" podID="74afd491-5339-485f-8a90-91f658a5f98e" containerID="6b6ad7c067dc71c1271a4723cfb266c0268f720ee6a7f89bc1b51c4f986103d9" exitCode=0
Nov 29 14:51:52 crc kubenswrapper[4907]: I1129 14:51:52.245658 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" event={"ID":"74afd491-5339-485f-8a90-91f658a5f98e","Type":"ContainerDied","Data":"6b6ad7c067dc71c1271a4723cfb266c0268f720ee6a7f89bc1b51c4f986103d9"}
Nov 29 14:51:52 crc kubenswrapper[4907]: I1129 14:51:52.281282 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bf6df8b58-8kpjs"]
Nov 29 14:51:52 crc kubenswrapper[4907]: I1129 14:51:52.516395 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757103ec-cde8-4a4d-9114-bb121bfe7ffa" path="/var/lib/kubelet/pods/757103ec-cde8-4a4d-9114-bb121bfe7ffa/volumes"
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.267236 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2941047a-90ba-4b26-8678-174546f90d18","Type":"ContainerStarted","Data":"1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a"}
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.270982 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" event={"ID":"74afd491-5339-485f-8a90-91f658a5f98e","Type":"ContainerStarted","Data":"3ed07bdb3db1e6d5e8c30418a10bb6df7ef82641a7b2bfb6911c24ef9bd299af"}
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.271659 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4"
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.281845 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"94ed39fa-abc7-45e2-8919-43609b0bfd9e","Type":"ContainerStarted","Data":"6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f"}
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.284890 4907 generic.go:334] "Generic (PLEG): container finished" podID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerID="5bb03f96ea58f6710f35fa0e0f2b34eaf307facac712b36e63bcc71d9ab89995" exitCode=0
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.284952 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7c3295c-d537-4302-80c1-ce39f0f4fcb4","Type":"ContainerDied","Data":"5bb03f96ea58f6710f35fa0e0f2b34eaf307facac712b36e63bcc71d9ab89995"}
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.290597 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerStarted","Data":"77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625"}
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.290805 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" containerID="cri-o://c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803" gracePeriod=30
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.290817 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bf6df8b58-8kpjs" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" containerID="cri-o://77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625" gracePeriod=30
Nov 29 14:51:53 crc kubenswrapper[4907]: I1129 14:51:53.308320 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" podStartSLOduration=4.308301474 podStartE2EDuration="4.308301474s" podCreationTimestamp="2025-11-29 14:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:53.298478384 +0000 UTC m=+1411.285316036" watchObservedRunningTime="2025-11-29 14:51:53.308301474 +0000 UTC m=+1411.295139126"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.099789 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bf6df8b58-8kpjs"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.184422 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data\") pod \"e0a84a40-6906-41df-8af5-ba4c16a2c322\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") "
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.184882 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a84a40-6906-41df-8af5-ba4c16a2c322-logs\") pod \"e0a84a40-6906-41df-8af5-ba4c16a2c322\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") "
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.184981 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-combined-ca-bundle\") pod \"e0a84a40-6906-41df-8af5-ba4c16a2c322\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") "
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.185024 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data-custom\") pod \"e0a84a40-6906-41df-8af5-ba4c16a2c322\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") "
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.185143 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgvkp\" (UniqueName: \"kubernetes.io/projected/e0a84a40-6906-41df-8af5-ba4c16a2c322-kube-api-access-mgvkp\") pod \"e0a84a40-6906-41df-8af5-ba4c16a2c322\" (UID: \"e0a84a40-6906-41df-8af5-ba4c16a2c322\") "
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.185331 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a84a40-6906-41df-8af5-ba4c16a2c322-logs" (OuterVolumeSpecName: "logs") pod "e0a84a40-6906-41df-8af5-ba4c16a2c322" (UID: "e0a84a40-6906-41df-8af5-ba4c16a2c322"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.185624 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a84a40-6906-41df-8af5-ba4c16a2c322-logs\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.190290 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a84a40-6906-41df-8af5-ba4c16a2c322-kube-api-access-mgvkp" (OuterVolumeSpecName: "kube-api-access-mgvkp") pod "e0a84a40-6906-41df-8af5-ba4c16a2c322" (UID: "e0a84a40-6906-41df-8af5-ba4c16a2c322"). InnerVolumeSpecName "kube-api-access-mgvkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.190543 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0a84a40-6906-41df-8af5-ba4c16a2c322" (UID: "e0a84a40-6906-41df-8af5-ba4c16a2c322"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.216707 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0a84a40-6906-41df-8af5-ba4c16a2c322" (UID: "e0a84a40-6906-41df-8af5-ba4c16a2c322"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.264912 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data" (OuterVolumeSpecName: "config-data") pod "e0a84a40-6906-41df-8af5-ba4c16a2c322" (UID: "e0a84a40-6906-41df-8af5-ba4c16a2c322"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.287417 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.287459 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.287469 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgvkp\" (UniqueName: \"kubernetes.io/projected/e0a84a40-6906-41df-8af5-ba4c16a2c322-kube-api-access-mgvkp\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.287480 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a84a40-6906-41df-8af5-ba4c16a2c322-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.335428 4907 generic.go:334] "Generic (PLEG): container finished" podID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerID="77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625" exitCode=1
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.335474 4907 generic.go:334] "Generic (PLEG): container finished" podID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerID="c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803" exitCode=143
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.335514 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerDied","Data":"77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625"}
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.335540 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerDied","Data":"c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803"}
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.335550 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bf6df8b58-8kpjs" event={"ID":"e0a84a40-6906-41df-8af5-ba4c16a2c322","Type":"ContainerDied","Data":"1971a4b096def06be1875e18535da5a9534a4166ecd592bc64ed558af9b87577"}
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.335566 4907 scope.go:117] "RemoveContainer" containerID="77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.335692 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bf6df8b58-8kpjs"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.347757 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2941047a-90ba-4b26-8678-174546f90d18","Type":"ContainerStarted","Data":"3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241"}
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.348127 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2941047a-90ba-4b26-8678-174546f90d18" containerName="cinder-api-log" containerID="cri-o://1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a" gracePeriod=30
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.348497 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.350543 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2941047a-90ba-4b26-8678-174546f90d18" containerName="cinder-api" containerID="cri-o://3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241" gracePeriod=30
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.378060 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"94ed39fa-abc7-45e2-8919-43609b0bfd9e","Type":"ContainerStarted","Data":"51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d"}
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.411581 4907 scope.go:117] "RemoveContainer" containerID="c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.414125 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.414111051 podStartE2EDuration="5.414111051s" podCreationTimestamp="2025-11-29 14:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:54.372501445 +0000 UTC m=+1412.359339097" watchObservedRunningTime="2025-11-29 14:51:54.414111051 +0000 UTC m=+1412.400948703"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.419914 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.363052874 podStartE2EDuration="5.419900356s" podCreationTimestamp="2025-11-29 14:51:49 +0000 UTC" firstStartedPulling="2025-11-29 14:51:50.43950717 +0000 UTC m=+1408.426344822" lastFinishedPulling="2025-11-29 14:51:51.496354652 +0000 UTC m=+1409.483192304" observedRunningTime="2025-11-29 14:51:54.392042062 +0000 UTC m=+1412.378879714" watchObservedRunningTime="2025-11-29 14:51:54.419900356 +0000 UTC m=+1412.406738008"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.435578 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bf6df8b58-8kpjs"]
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.436092 4907 scope.go:117] "RemoveContainer" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.444749 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5bf6df8b58-8kpjs"]
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.481527 4907 scope.go:117] "RemoveContainer" containerID="4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.496354 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" path="/var/lib/kubelet/pods/e0a84a40-6906-41df-8af5-ba4c16a2c322/volumes"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.516357 4907 scope.go:117] "RemoveContainer" containerID="77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625"
Nov 29 14:51:54 crc kubenswrapper[4907]: E1129 14:51:54.516819 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625\": container with ID starting with 77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625 not found: ID does not exist" containerID="77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.516852 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625"} err="failed to get container status \"77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625\": rpc error: code = NotFound desc = could not find container \"77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625\": container with ID starting with 77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625 not found: ID does not exist"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.516876 4907 scope.go:117] "RemoveContainer" containerID="c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803"
Nov 29 14:51:54 crc kubenswrapper[4907]: E1129 14:51:54.517970 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803\": container with ID starting with c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803 not found: ID does not exist" containerID="c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.517993 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803"} err="failed to get container status \"c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803\": rpc error: code = NotFound desc = could not find container \"c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803\": container with ID starting with c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803 not found: ID does not exist"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.518026 4907 scope.go:117] "RemoveContainer" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d"
Nov 29 14:51:54 crc kubenswrapper[4907]: E1129 14:51:54.518333 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d\": container with ID starting with 568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d not found: ID does not exist" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.518352 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d"} err="failed to get container status \"568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d\": rpc error: code = NotFound desc = could not find container \"568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d\": container with ID starting with 568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d not found: ID does not exist"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.518364 4907 scope.go:117] "RemoveContainer" containerID="4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a"
Nov 29 14:51:54 crc kubenswrapper[4907]: E1129 14:51:54.518628 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a\": container with ID starting with 4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a not found: ID does not exist" containerID="4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.518657 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a"} err="failed to get container status \"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a\": rpc error: code = NotFound desc = could not find container \"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a\": container with ID starting with 4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a not found: ID does not exist"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.518673 4907 scope.go:117] "RemoveContainer" containerID="77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.519099 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625"} err="failed to get container status \"77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625\": rpc error: code = NotFound desc = could not find container \"77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625\": container with ID starting with 77a26669123f25fc695d495f7000e74108366fa565e8a03d27e23b8cf34d8625 not found: ID does not exist"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.519139 4907 scope.go:117] "RemoveContainer" containerID="c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.519460 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803"} err="failed to get container status \"c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803\": rpc error: code = NotFound desc = could not find container \"c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803\": container with ID starting with c927175a1e34d34cc9088099f9a51d2f9956438f16b5832798e4171578d1a803 not found: ID does not exist"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.519476 4907 scope.go:117] "RemoveContainer" containerID="568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.520593 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d"} err="failed to get container status \"568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d\": rpc error: code = NotFound desc = could not find container \"568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d\": container with ID starting with 568d0a75e95dc5aa068f8622d608f8bcc09e7a3b97b52b73762502ad35b29d0d not found: ID does not exist"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.520635 4907 scope.go:117] "RemoveContainer" containerID="4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.521876 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a"} err="failed to get container status \"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a\": rpc error: code = NotFound desc = could not find container \"4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a\": container with ID starting with 4f13de210d631039a3b496687153af54cbdbcf090eb3b59d3b20b770d6f1581a not found: ID does not exist"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.616257 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Nov 29 14:51:54 crc kubenswrapper[4907]: I1129 14:51:54.955585 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.105291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-scripts\") pod \"2941047a-90ba-4b26-8678-174546f90d18\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") "
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.105624 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2941047a-90ba-4b26-8678-174546f90d18-etc-machine-id\") pod \"2941047a-90ba-4b26-8678-174546f90d18\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") "
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.105647 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data\") pod \"2941047a-90ba-4b26-8678-174546f90d18\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") "
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.105711 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data-custom\") pod \"2941047a-90ba-4b26-8678-174546f90d18\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") "
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.105777 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v5st\" (UniqueName: \"kubernetes.io/projected/2941047a-90ba-4b26-8678-174546f90d18-kube-api-access-6v5st\") pod \"2941047a-90ba-4b26-8678-174546f90d18\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") "
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.105779 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2941047a-90ba-4b26-8678-174546f90d18-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2941047a-90ba-4b26-8678-174546f90d18" (UID: "2941047a-90ba-4b26-8678-174546f90d18"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.105852 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2941047a-90ba-4b26-8678-174546f90d18-logs\") pod \"2941047a-90ba-4b26-8678-174546f90d18\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") "
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.105960 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-combined-ca-bundle\") pod \"2941047a-90ba-4b26-8678-174546f90d18\" (UID: \"2941047a-90ba-4b26-8678-174546f90d18\") "
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.106250 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2941047a-90ba-4b26-8678-174546f90d18-logs" (OuterVolumeSpecName: "logs") pod "2941047a-90ba-4b26-8678-174546f90d18" (UID: "2941047a-90ba-4b26-8678-174546f90d18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.106816 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2941047a-90ba-4b26-8678-174546f90d18-logs\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.106831 4907 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2941047a-90ba-4b26-8678-174546f90d18-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.111945 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2941047a-90ba-4b26-8678-174546f90d18" (UID: "2941047a-90ba-4b26-8678-174546f90d18"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.112513 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-scripts" (OuterVolumeSpecName: "scripts") pod "2941047a-90ba-4b26-8678-174546f90d18" (UID: "2941047a-90ba-4b26-8678-174546f90d18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.118761 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2941047a-90ba-4b26-8678-174546f90d18-kube-api-access-6v5st" (OuterVolumeSpecName: "kube-api-access-6v5st") pod "2941047a-90ba-4b26-8678-174546f90d18" (UID: "2941047a-90ba-4b26-8678-174546f90d18"). InnerVolumeSpecName "kube-api-access-6v5st". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.134088 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2941047a-90ba-4b26-8678-174546f90d18" (UID: "2941047a-90ba-4b26-8678-174546f90d18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.158010 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data" (OuterVolumeSpecName: "config-data") pod "2941047a-90ba-4b26-8678-174546f90d18" (UID: "2941047a-90ba-4b26-8678-174546f90d18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.209060 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.209100 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.209111 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v5st\" (UniqueName: \"kubernetes.io/projected/2941047a-90ba-4b26-8678-174546f90d18-kube-api-access-6v5st\") on node \"crc\" DevicePath \"\""
Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.209120 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-combined-ca-bundle\") on node
\"crc\" DevicePath \"\"" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.209128 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2941047a-90ba-4b26-8678-174546f90d18-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.387219 4907 generic.go:334] "Generic (PLEG): container finished" podID="2941047a-90ba-4b26-8678-174546f90d18" containerID="3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241" exitCode=0 Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.387256 4907 generic.go:334] "Generic (PLEG): container finished" podID="2941047a-90ba-4b26-8678-174546f90d18" containerID="1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a" exitCode=143 Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.388229 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.396359 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2941047a-90ba-4b26-8678-174546f90d18","Type":"ContainerDied","Data":"3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241"} Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.396392 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2941047a-90ba-4b26-8678-174546f90d18","Type":"ContainerDied","Data":"1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a"} Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.396403 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2941047a-90ba-4b26-8678-174546f90d18","Type":"ContainerDied","Data":"cc31111472cfc86ae069eaed0a3c0f0282f4be6055ba1bce5670509d6fe0cda7"} Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.396417 4907 scope.go:117] "RemoveContainer" 
containerID="3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.426143 4907 scope.go:117] "RemoveContainer" containerID="1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.433171 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.448927 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.456884 4907 scope.go:117] "RemoveContainer" containerID="3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241" Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.457300 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241\": container with ID starting with 3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241 not found: ID does not exist" containerID="3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.457328 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241"} err="failed to get container status \"3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241\": rpc error: code = NotFound desc = could not find container \"3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241\": container with ID starting with 3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241 not found: ID does not exist" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.457348 4907 scope.go:117] "RemoveContainer" containerID="1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a" Nov 29 14:51:55 crc 
kubenswrapper[4907]: E1129 14:51:55.457636 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a\": container with ID starting with 1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a not found: ID does not exist" containerID="1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.457683 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a"} err="failed to get container status \"1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a\": rpc error: code = NotFound desc = could not find container \"1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a\": container with ID starting with 1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a not found: ID does not exist" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.457710 4907 scope.go:117] "RemoveContainer" containerID="3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.457936 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241"} err="failed to get container status \"3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241\": rpc error: code = NotFound desc = could not find container \"3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241\": container with ID starting with 3761c447dbf0517b62d5a88763cfbeb8e80c28e2fbae25178840d78ba1cd5241 not found: ID does not exist" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.457959 4907 scope.go:117] "RemoveContainer" containerID="1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a" Nov 29 
14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.458123 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a"} err="failed to get container status \"1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a\": rpc error: code = NotFound desc = could not find container \"1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a\": container with ID starting with 1b43dfac9253519adee253e197fbac509ae811d3e263af484a3d16f764d64a8a not found: ID does not exist" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.461323 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.461790 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2941047a-90ba-4b26-8678-174546f90d18" containerName="cinder-api" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.461808 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2941047a-90ba-4b26-8678-174546f90d18" containerName="cinder-api" Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.461821 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.461828 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.461839 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.461845 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.461854 4907 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="757103ec-cde8-4a4d-9114-bb121bfe7ffa" containerName="dnsmasq-dns" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.461860 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="757103ec-cde8-4a4d-9114-bb121bfe7ffa" containerName="dnsmasq-dns" Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.461873 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2941047a-90ba-4b26-8678-174546f90d18" containerName="cinder-api-log" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.461879 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2941047a-90ba-4b26-8678-174546f90d18" containerName="cinder-api-log" Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.461903 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.461909 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.461920 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757103ec-cde8-4a4d-9114-bb121bfe7ffa" containerName="init" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.461925 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="757103ec-cde8-4a4d-9114-bb121bfe7ffa" containerName="init" Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.461945 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.461950 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.462148 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.462164 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.462180 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2941047a-90ba-4b26-8678-174546f90d18" containerName="cinder-api-log" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.462191 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.462199 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2941047a-90ba-4b26-8678-174546f90d18" containerName="cinder-api" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.462221 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="757103ec-cde8-4a4d-9114-bb121bfe7ffa" containerName="dnsmasq-dns" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.462230 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" Nov 29 14:51:55 crc kubenswrapper[4907]: E1129 14:51:55.462436 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.462462 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api-log" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.462659 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a84a40-6906-41df-8af5-ba4c16a2c322" containerName="barbican-api" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.463362 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.465811 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.465995 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.466096 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.485572 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.617352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-config-data-custom\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.617396 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-config-data\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.617422 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.617497 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-scripts\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.617515 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7255a0d-394e-4d14-bc92-327e101b6ed3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.617557 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7255a0d-394e-4d14-bc92-327e101b6ed3-logs\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.617577 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.617625 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt7xn\" (UniqueName: \"kubernetes.io/projected/b7255a0d-394e-4d14-bc92-327e101b6ed3-kube-api-access-xt7xn\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.617663 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-public-tls-certs\") pod \"cinder-api-0\" 
(UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.719324 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-config-data-custom\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.719370 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-config-data\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.719397 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.719430 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-scripts\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.719462 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7255a0d-394e-4d14-bc92-327e101b6ed3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.719493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7255a0d-394e-4d14-bc92-327e101b6ed3-logs\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.719510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.719561 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt7xn\" (UniqueName: \"kubernetes.io/projected/b7255a0d-394e-4d14-bc92-327e101b6ed3-kube-api-access-xt7xn\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.719597 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.721045 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7255a0d-394e-4d14-bc92-327e101b6ed3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.721084 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7255a0d-394e-4d14-bc92-327e101b6ed3-logs\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc 
kubenswrapper[4907]: I1129 14:51:55.724809 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-scripts\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.725168 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.725789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.725905 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.733454 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-config-data-custom\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.735842 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt7xn\" (UniqueName: \"kubernetes.io/projected/b7255a0d-394e-4d14-bc92-327e101b6ed3-kube-api-access-xt7xn\") pod 
\"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.737681 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7255a0d-394e-4d14-bc92-327e101b6ed3-config-data\") pod \"cinder-api-0\" (UID: \"b7255a0d-394e-4d14-bc92-327e101b6ed3\") " pod="openstack/cinder-api-0" Nov 29 14:51:55 crc kubenswrapper[4907]: I1129 14:51:55.784463 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 14:51:56 crc kubenswrapper[4907]: I1129 14:51:56.068160 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-68c5f6d545-tlmv5" Nov 29 14:51:56 crc kubenswrapper[4907]: I1129 14:51:56.135231 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dfc6d84d8-nk2pb"] Nov 29 14:51:56 crc kubenswrapper[4907]: I1129 14:51:56.135471 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dfc6d84d8-nk2pb" podUID="f3eac663-661d-4bdb-bf65-3d92c9019225" containerName="neutron-api" containerID="cri-o://8fdcdd1b28e6cdbe5dad885dff079712be2bfe4635dd08d1555aa547006ff028" gracePeriod=30 Nov 29 14:51:56 crc kubenswrapper[4907]: I1129 14:51:56.135930 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dfc6d84d8-nk2pb" podUID="f3eac663-661d-4bdb-bf65-3d92c9019225" containerName="neutron-httpd" containerID="cri-o://7e8cbc93d8d2f69d58bda5141aed2f4b91f0365b241ea68c2b3ff9697d99b781" gracePeriod=30 Nov 29 14:51:56 crc kubenswrapper[4907]: I1129 14:51:56.274300 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 14:51:56 crc kubenswrapper[4907]: I1129 14:51:56.413626 4907 generic.go:334] "Generic (PLEG): container finished" podID="f3eac663-661d-4bdb-bf65-3d92c9019225" 
containerID="7e8cbc93d8d2f69d58bda5141aed2f4b91f0365b241ea68c2b3ff9697d99b781" exitCode=0 Nov 29 14:51:56 crc kubenswrapper[4907]: I1129 14:51:56.413689 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfc6d84d8-nk2pb" event={"ID":"f3eac663-661d-4bdb-bf65-3d92c9019225","Type":"ContainerDied","Data":"7e8cbc93d8d2f69d58bda5141aed2f4b91f0365b241ea68c2b3ff9697d99b781"} Nov 29 14:51:56 crc kubenswrapper[4907]: I1129 14:51:56.418781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b7255a0d-394e-4d14-bc92-327e101b6ed3","Type":"ContainerStarted","Data":"bac764c8bc1b36be206e717244712aba68c22aa3b8c6aa0aea5806117f443ba2"} Nov 29 14:51:56 crc kubenswrapper[4907]: I1129 14:51:56.495788 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2941047a-90ba-4b26-8678-174546f90d18" path="/var/lib/kubelet/pods/2941047a-90ba-4b26-8678-174546f90d18/volumes" Nov 29 14:51:57 crc kubenswrapper[4907]: I1129 14:51:57.441762 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b7255a0d-394e-4d14-bc92-327e101b6ed3","Type":"ContainerStarted","Data":"727f3ccd5b37dd9f5da163d637156cddce8458dd3965e736618d50838d3da253"} Nov 29 14:51:58 crc kubenswrapper[4907]: I1129 14:51:58.453731 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b7255a0d-394e-4d14-bc92-327e101b6ed3","Type":"ContainerStarted","Data":"03194550f9138762d016135830ca0a2d29ac980034a541464a1b3265e26b5d02"} Nov 29 14:51:58 crc kubenswrapper[4907]: I1129 14:51:58.454039 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 29 14:51:58 crc kubenswrapper[4907]: I1129 14:51:58.490404 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.490382419 podStartE2EDuration="3.490382419s" podCreationTimestamp="2025-11-29 14:51:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:51:58.476455482 +0000 UTC m=+1416.463293134" watchObservedRunningTime="2025-11-29 14:51:58.490382419 +0000 UTC m=+1416.477220071" Nov 29 14:51:59 crc kubenswrapper[4907]: I1129 14:51:59.822731 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 29 14:51:59 crc kubenswrapper[4907]: I1129 14:51:59.838654 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:51:59 crc kubenswrapper[4907]: I1129 14:51:59.966775 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.020405 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-s8h6d"] Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.020672 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" podUID="07213bf3-24a2-492a-8094-21b774bb7b97" containerName="dnsmasq-dns" containerID="cri-o://4d567e1378c65706be58e4b02188fc84061792e5e0cb65d989fc6545e0a5f7af" gracePeriod=10 Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.489282 4907 generic.go:334] "Generic (PLEG): container finished" podID="07213bf3-24a2-492a-8094-21b774bb7b97" containerID="4d567e1378c65706be58e4b02188fc84061792e5e0cb65d989fc6545e0a5f7af" exitCode=0 Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.489837 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerName="cinder-scheduler" containerID="cri-o://6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f" gracePeriod=30 Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.490382 4907 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerName="probe" containerID="cri-o://51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d" gracePeriod=30 Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.492930 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" event={"ID":"07213bf3-24a2-492a-8094-21b774bb7b97","Type":"ContainerDied","Data":"4d567e1378c65706be58e4b02188fc84061792e5e0cb65d989fc6545e0a5f7af"} Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.492964 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" event={"ID":"07213bf3-24a2-492a-8094-21b774bb7b97","Type":"ContainerDied","Data":"3847bd34974ba05b87b571ccba6d1ee6151a8b2c129f649dfaa7509e197c26e2"} Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.492975 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3847bd34974ba05b87b571ccba6d1ee6151a8b2c129f649dfaa7509e197c26e2" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.541756 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.649293 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-nb\") pod \"07213bf3-24a2-492a-8094-21b774bb7b97\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.649343 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-config\") pod \"07213bf3-24a2-492a-8094-21b774bb7b97\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.649657 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-sb\") pod \"07213bf3-24a2-492a-8094-21b774bb7b97\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.649747 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8klw\" (UniqueName: \"kubernetes.io/projected/07213bf3-24a2-492a-8094-21b774bb7b97-kube-api-access-x8klw\") pod \"07213bf3-24a2-492a-8094-21b774bb7b97\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.649802 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-dns-svc\") pod \"07213bf3-24a2-492a-8094-21b774bb7b97\" (UID: \"07213bf3-24a2-492a-8094-21b774bb7b97\") " Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.660031 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/07213bf3-24a2-492a-8094-21b774bb7b97-kube-api-access-x8klw" (OuterVolumeSpecName: "kube-api-access-x8klw") pod "07213bf3-24a2-492a-8094-21b774bb7b97" (UID: "07213bf3-24a2-492a-8094-21b774bb7b97"). InnerVolumeSpecName "kube-api-access-x8klw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.719864 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-config" (OuterVolumeSpecName: "config") pod "07213bf3-24a2-492a-8094-21b774bb7b97" (UID: "07213bf3-24a2-492a-8094-21b774bb7b97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.723166 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07213bf3-24a2-492a-8094-21b774bb7b97" (UID: "07213bf3-24a2-492a-8094-21b774bb7b97"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.724281 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07213bf3-24a2-492a-8094-21b774bb7b97" (UID: "07213bf3-24a2-492a-8094-21b774bb7b97"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.733345 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07213bf3-24a2-492a-8094-21b774bb7b97" (UID: "07213bf3-24a2-492a-8094-21b774bb7b97"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.752866 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.752906 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.752919 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.752933 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8klw\" (UniqueName: \"kubernetes.io/projected/07213bf3-24a2-492a-8094-21b774bb7b97-kube-api-access-x8klw\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:00 crc kubenswrapper[4907]: I1129 14:52:00.752948 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07213bf3-24a2-492a-8094-21b774bb7b97-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:01 crc kubenswrapper[4907]: I1129 14:52:01.501250 4907 generic.go:334] "Generic (PLEG): container finished" podID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerID="51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d" exitCode=0 Nov 29 14:52:01 crc kubenswrapper[4907]: I1129 14:52:01.501323 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"94ed39fa-abc7-45e2-8919-43609b0bfd9e","Type":"ContainerDied","Data":"51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d"} Nov 29 14:52:01 crc kubenswrapper[4907]: I1129 
14:52:01.501555 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-s8h6d" Nov 29 14:52:01 crc kubenswrapper[4907]: I1129 14:52:01.535785 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-s8h6d"] Nov 29 14:52:01 crc kubenswrapper[4907]: I1129 14:52:01.548163 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-s8h6d"] Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.504049 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07213bf3-24a2-492a-8094-21b774bb7b97" path="/var/lib/kubelet/pods/07213bf3-24a2-492a-8094-21b774bb7b97/volumes" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.521759 4907 generic.go:334] "Generic (PLEG): container finished" podID="f3eac663-661d-4bdb-bf65-3d92c9019225" containerID="8fdcdd1b28e6cdbe5dad885dff079712be2bfe4635dd08d1555aa547006ff028" exitCode=0 Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.521809 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfc6d84d8-nk2pb" event={"ID":"f3eac663-661d-4bdb-bf65-3d92c9019225","Type":"ContainerDied","Data":"8fdcdd1b28e6cdbe5dad885dff079712be2bfe4635dd08d1555aa547006ff028"} Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.521848 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfc6d84d8-nk2pb" event={"ID":"f3eac663-661d-4bdb-bf65-3d92c9019225","Type":"ContainerDied","Data":"f10d0f544f72f6859d5df15e2c8b30d0815a25da7feda0a5ad83a0c0d9995853"} Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.521871 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f10d0f544f72f6859d5df15e2c8b30d0815a25da7feda0a5ad83a0c0d9995853" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.591725 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.691545 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-ovndb-tls-certs\") pod \"f3eac663-661d-4bdb-bf65-3d92c9019225\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.691939 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-httpd-config\") pod \"f3eac663-661d-4bdb-bf65-3d92c9019225\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.692085 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-config\") pod \"f3eac663-661d-4bdb-bf65-3d92c9019225\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.692119 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p45gd\" (UniqueName: \"kubernetes.io/projected/f3eac663-661d-4bdb-bf65-3d92c9019225-kube-api-access-p45gd\") pod \"f3eac663-661d-4bdb-bf65-3d92c9019225\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.692243 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-combined-ca-bundle\") pod \"f3eac663-661d-4bdb-bf65-3d92c9019225\" (UID: \"f3eac663-661d-4bdb-bf65-3d92c9019225\") " Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.697913 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f3eac663-661d-4bdb-bf65-3d92c9019225-kube-api-access-p45gd" (OuterVolumeSpecName: "kube-api-access-p45gd") pod "f3eac663-661d-4bdb-bf65-3d92c9019225" (UID: "f3eac663-661d-4bdb-bf65-3d92c9019225"). InnerVolumeSpecName "kube-api-access-p45gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.698053 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f3eac663-661d-4bdb-bf65-3d92c9019225" (UID: "f3eac663-661d-4bdb-bf65-3d92c9019225"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.762230 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-config" (OuterVolumeSpecName: "config") pod "f3eac663-661d-4bdb-bf65-3d92c9019225" (UID: "f3eac663-661d-4bdb-bf65-3d92c9019225"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.779657 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3eac663-661d-4bdb-bf65-3d92c9019225" (UID: "f3eac663-661d-4bdb-bf65-3d92c9019225"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.794417 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.794486 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.794499 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p45gd\" (UniqueName: \"kubernetes.io/projected/f3eac663-661d-4bdb-bf65-3d92c9019225-kube-api-access-p45gd\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.794510 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.794941 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f3eac663-661d-4bdb-bf65-3d92c9019225" (UID: "f3eac663-661d-4bdb-bf65-3d92c9019225"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:02 crc kubenswrapper[4907]: I1129 14:52:02.896218 4907 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eac663-661d-4bdb-bf65-3d92c9019225-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:03 crc kubenswrapper[4907]: I1129 14:52:03.536353 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dfc6d84d8-nk2pb" Nov 29 14:52:03 crc kubenswrapper[4907]: I1129 14:52:03.584796 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dfc6d84d8-nk2pb"] Nov 29 14:52:03 crc kubenswrapper[4907]: I1129 14:52:03.595766 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-dfc6d84d8-nk2pb"] Nov 29 14:52:04 crc kubenswrapper[4907]: I1129 14:52:04.496055 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3eac663-661d-4bdb-bf65-3d92c9019225" path="/var/lib/kubelet/pods/f3eac663-661d-4bdb-bf65-3d92c9019225/volumes" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.381855 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.469320 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data-custom\") pod \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.469406 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-combined-ca-bundle\") pod \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.469592 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-scripts\") pod \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.469619 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data\") pod \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.469636 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94ed39fa-abc7-45e2-8919-43609b0bfd9e-etc-machine-id\") pod \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.469706 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78pqv\" (UniqueName: \"kubernetes.io/projected/94ed39fa-abc7-45e2-8919-43609b0bfd9e-kube-api-access-78pqv\") pod \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\" (UID: \"94ed39fa-abc7-45e2-8919-43609b0bfd9e\") " Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.473584 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94ed39fa-abc7-45e2-8919-43609b0bfd9e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "94ed39fa-abc7-45e2-8919-43609b0bfd9e" (UID: "94ed39fa-abc7-45e2-8919-43609b0bfd9e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.481600 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "94ed39fa-abc7-45e2-8919-43609b0bfd9e" (UID: "94ed39fa-abc7-45e2-8919-43609b0bfd9e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.481667 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94ed39fa-abc7-45e2-8919-43609b0bfd9e-kube-api-access-78pqv" (OuterVolumeSpecName: "kube-api-access-78pqv") pod "94ed39fa-abc7-45e2-8919-43609b0bfd9e" (UID: "94ed39fa-abc7-45e2-8919-43609b0bfd9e"). InnerVolumeSpecName "kube-api-access-78pqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.491583 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-scripts" (OuterVolumeSpecName: "scripts") pod "94ed39fa-abc7-45e2-8919-43609b0bfd9e" (UID: "94ed39fa-abc7-45e2-8919-43609b0bfd9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.534794 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94ed39fa-abc7-45e2-8919-43609b0bfd9e" (UID: "94ed39fa-abc7-45e2-8919-43609b0bfd9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.573305 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.573351 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.573361 4907 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94ed39fa-abc7-45e2-8919-43609b0bfd9e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.573370 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78pqv\" (UniqueName: \"kubernetes.io/projected/94ed39fa-abc7-45e2-8919-43609b0bfd9e-kube-api-access-78pqv\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.573381 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.586402 4907 generic.go:334] "Generic (PLEG): container finished" podID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerID="6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f" exitCode=0 Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.587124 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"94ed39fa-abc7-45e2-8919-43609b0bfd9e","Type":"ContainerDied","Data":"6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f"} Nov 29 14:52:05 crc kubenswrapper[4907]: 
I1129 14:52:05.587154 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"94ed39fa-abc7-45e2-8919-43609b0bfd9e","Type":"ContainerDied","Data":"d55703ac9173502f31a62d482f92121a7b22f0e2cdb18764cbd007a18f93fef6"} Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.587170 4907 scope.go:117] "RemoveContainer" containerID="51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.587339 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.625533 4907 scope.go:117] "RemoveContainer" containerID="6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.660069 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data" (OuterVolumeSpecName: "config-data") pod "94ed39fa-abc7-45e2-8919-43609b0bfd9e" (UID: "94ed39fa-abc7-45e2-8919-43609b0bfd9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.667413 4907 scope.go:117] "RemoveContainer" containerID="51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d" Nov 29 14:52:05 crc kubenswrapper[4907]: E1129 14:52:05.670508 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d\": container with ID starting with 51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d not found: ID does not exist" containerID="51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.670544 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d"} err="failed to get container status \"51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d\": rpc error: code = NotFound desc = could not find container \"51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d\": container with ID starting with 51f5d2a1ad3ca7d9742ee845c59c098a5b0ca0d55b018954590275768e6f885d not found: ID does not exist" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.670567 4907 scope.go:117] "RemoveContainer" containerID="6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f" Nov 29 14:52:05 crc kubenswrapper[4907]: E1129 14:52:05.674531 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f\": container with ID starting with 6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f not found: ID does not exist" containerID="6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.674564 
4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f"} err="failed to get container status \"6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f\": rpc error: code = NotFound desc = could not find container \"6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f\": container with ID starting with 6e8b95828aad3a7fdb08f64e16a61ebe38c796b7443256c1c1822e154094d69f not found: ID does not exist" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.677049 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ed39fa-abc7-45e2-8919-43609b0bfd9e-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.935034 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.947665 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.960478 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 14:52:05 crc kubenswrapper[4907]: E1129 14:52:05.961356 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerName="cinder-scheduler" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961377 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerName="cinder-scheduler" Nov 29 14:52:05 crc kubenswrapper[4907]: E1129 14:52:05.961391 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eac663-661d-4bdb-bf65-3d92c9019225" containerName="neutron-api" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961400 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f3eac663-661d-4bdb-bf65-3d92c9019225" containerName="neutron-api" Nov 29 14:52:05 crc kubenswrapper[4907]: E1129 14:52:05.961422 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07213bf3-24a2-492a-8094-21b774bb7b97" containerName="dnsmasq-dns" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961430 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="07213bf3-24a2-492a-8094-21b774bb7b97" containerName="dnsmasq-dns" Nov 29 14:52:05 crc kubenswrapper[4907]: E1129 14:52:05.961472 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07213bf3-24a2-492a-8094-21b774bb7b97" containerName="init" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961491 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="07213bf3-24a2-492a-8094-21b774bb7b97" containerName="init" Nov 29 14:52:05 crc kubenswrapper[4907]: E1129 14:52:05.961523 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eac663-661d-4bdb-bf65-3d92c9019225" containerName="neutron-httpd" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961532 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eac663-661d-4bdb-bf65-3d92c9019225" containerName="neutron-httpd" Nov 29 14:52:05 crc kubenswrapper[4907]: E1129 14:52:05.961561 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerName="probe" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961570 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerName="probe" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961815 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerName="cinder-scheduler" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961828 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eac663-661d-4bdb-bf65-3d92c9019225" 
containerName="neutron-httpd" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961852 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" containerName="probe" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961875 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eac663-661d-4bdb-bf65-3d92c9019225" containerName="neutron-api" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.961894 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="07213bf3-24a2-492a-8094-21b774bb7b97" containerName="dnsmasq-dns" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.963410 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 14:52:05 crc kubenswrapper[4907]: I1129 14:52:05.966634 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.021772 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.092407 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0" Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.092730 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-config-data\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0" Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.093035 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-scripts\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.093129 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.093211 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84h2r\" (UniqueName: \"kubernetes.io/projected/dc42ffed-7148-4260-82d7-0b4a2fecc830-kube-api-access-84h2r\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.093304 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc42ffed-7148-4260-82d7-0b4a2fecc830-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.194959 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.195056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-config-data\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.195135 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-scripts\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.195169 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.195202 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84h2r\" (UniqueName: \"kubernetes.io/projected/dc42ffed-7148-4260-82d7-0b4a2fecc830-kube-api-access-84h2r\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.195236 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc42ffed-7148-4260-82d7-0b4a2fecc830-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.195339 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc42ffed-7148-4260-82d7-0b4a2fecc830-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.199307 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.201289 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-config-data\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.209008 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65859db6b4-hwsds"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.209959 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-scripts\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.210597 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65859db6b4-hwsds"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.217091 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc42ffed-7148-4260-82d7-0b4a2fecc830-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.219909 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84h2r\" (UniqueName: \"kubernetes.io/projected/dc42ffed-7148-4260-82d7-0b4a2fecc830-kube-api-access-84h2r\") pod \"cinder-scheduler-0\" (UID: \"dc42ffed-7148-4260-82d7-0b4a2fecc830\") " pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.306872 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.506154 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94ed39fa-abc7-45e2-8919-43609b0bfd9e" path="/var/lib/kubelet/pods/94ed39fa-abc7-45e2-8919-43609b0bfd9e/volumes"
Nov 29 14:52:06 crc kubenswrapper[4907]: I1129 14:52:06.901135 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 14:52:07 crc kubenswrapper[4907]: I1129 14:52:07.354017 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76d8ccf675-75wqf"
Nov 29 14:52:07 crc kubenswrapper[4907]: I1129 14:52:07.627014 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dc42ffed-7148-4260-82d7-0b4a2fecc830","Type":"ContainerStarted","Data":"8e9e24ef8020dd123f40b3492cc30dbad485133367c414ce6f2cc977bc6863d8"}
Nov 29 14:52:07 crc kubenswrapper[4907]: I1129 14:52:07.627267 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dc42ffed-7148-4260-82d7-0b4a2fecc830","Type":"ContainerStarted","Data":"bd3290ccef66d4e69b0dfe2694ab8b55abb64cd9eaeef5399ad87c21fd853945"}
Nov 29 14:52:08 crc kubenswrapper[4907]: I1129 14:52:08.047684 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Nov 29 14:52:08 crc kubenswrapper[4907]: I1129 14:52:08.636874 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dc42ffed-7148-4260-82d7-0b4a2fecc830","Type":"ContainerStarted","Data":"4be55c465ae625516eef603ad4b800837bbc9fe045106f6d11cbedb83e39f8f1"}
Nov 29 14:52:08 crc kubenswrapper[4907]: I1129 14:52:08.665897 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.665878941 podStartE2EDuration="3.665878941s" podCreationTimestamp="2025-11-29 14:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:52:08.655827584 +0000 UTC m=+1426.642665236" watchObservedRunningTime="2025-11-29 14:52:08.665878941 +0000 UTC m=+1426.652716593"
Nov 29 14:52:11 crc kubenswrapper[4907]: I1129 14:52:11.307429 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.408878 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-644d94d9d7-tvfbj"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.411843 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.418795 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-f988f"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.419193 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.419394 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.444672 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-644d94d9d7-tvfbj"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.536303 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.544406 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.550665 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djvc6\" (UniqueName: \"kubernetes.io/projected/df2182cf-1b22-424b-8c39-e44567b07d45-kube-api-access-djvc6\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.550786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.551030 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data-custom\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.551067 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-combined-ca-bundle\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.556959 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.557310 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.566332 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8hfn7"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.653058 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djvc6\" (UniqueName: \"kubernetes.io/projected/df2182cf-1b22-424b-8c39-e44567b07d45-kube-api-access-djvc6\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.653128 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-openstack-config-secret\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.653168 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrl62\" (UniqueName: \"kubernetes.io/projected/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-kube-api-access-xrl62\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.653255 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.653318 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-openstack-config\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.653354 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-combined-ca-bundle\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.653339 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-txdv6"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.653539 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data-custom\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.653575 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-combined-ca-bundle\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.655357 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.659588 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.663494 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data-custom\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.665228 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-txdv6"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.682028 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-combined-ca-bundle\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.684790 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.736169 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-568fdbf79-c6gqs"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.737626 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.740349 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755617 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755664 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-openstack-config\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755698 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-combined-ca-bundle\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755716 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8d26\" (UniqueName: \"kubernetes.io/projected/3dbbcded-256c-4054-9ba2-a7b1bde35aea-kube-api-access-l8d26\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755750 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-config\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755825 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755863 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755891 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-openstack-config-secret\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755913 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrl62\" (UniqueName: \"kubernetes.io/projected/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-kube-api-access-xrl62\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.755932 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.757036 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-openstack-config\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.805571 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-568fdbf79-c6gqs"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.857945 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.857990 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8vgc\" (UniqueName: \"kubernetes.io/projected/e7817d43-2e6d-4f73-a592-22cd7fcf8787-kube-api-access-m8vgc\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.858032 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.858072 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data-custom\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.858111 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.858167 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.858194 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.858209 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-combined-ca-bundle\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.858241 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8d26\" (UniqueName: \"kubernetes.io/projected/3dbbcded-256c-4054-9ba2-a7b1bde35aea-kube-api-access-l8d26\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.858274 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-config\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.859238 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-config\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.859313 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.859770 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.861079 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.861620 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.889816 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5c474cb7d8-pslcd"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.891458 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.898790 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.900745 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djvc6\" (UniqueName: \"kubernetes.io/projected/df2182cf-1b22-424b-8c39-e44567b07d45-kube-api-access-djvc6\") pod \"heat-engine-644d94d9d7-tvfbj\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.901223 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-openstack-config-secret\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.918014 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-combined-ca-bundle\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.920266 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrl62\" (UniqueName: \"kubernetes.io/projected/71aeb8b9-6bde-4a3e-a6f1-6d7c192490be-kube-api-access-xrl62\") pod \"openstackclient\" (UID: \"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be\") " pod="openstack/openstackclient"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.922545 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5c474cb7d8-pslcd"]
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.947038 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8d26\" (UniqueName: \"kubernetes.io/projected/3dbbcded-256c-4054-9ba2-a7b1bde35aea-kube-api-access-l8d26\") pod \"dnsmasq-dns-7756b9d78c-txdv6\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.962719 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-combined-ca-bundle\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.962793 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8vgc\" (UniqueName: \"kubernetes.io/projected/e7817d43-2e6d-4f73-a592-22cd7fcf8787-kube-api-access-m8vgc\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.962869 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data-custom\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.962910 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.962959 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.962984 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-combined-ca-bundle\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.963026 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data-custom\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.963058 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snx2j\" (UniqueName: \"kubernetes.io/projected/1234755e-c8a2-4e9a-98ee-b700f1703728-kube-api-access-snx2j\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.967240 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data-custom\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.969385 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-combined-ca-bundle\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:12 crc kubenswrapper[4907]: I1129 14:52:12.970710 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.019865 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8vgc\" (UniqueName: \"kubernetes.io/projected/e7817d43-2e6d-4f73-a592-22cd7fcf8787-kube-api-access-m8vgc\") pod \"heat-cfnapi-568fdbf79-c6gqs\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.050553 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.060752 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-644d94d9d7-tvfbj"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.064882 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.065022 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data-custom\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.065066 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snx2j\" (UniqueName: \"kubernetes.io/projected/1234755e-c8a2-4e9a-98ee-b700f1703728-kube-api-access-snx2j\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.065124 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-combined-ca-bundle\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.071564 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data-custom\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.072309 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-combined-ca-bundle\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.079280 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-568fdbf79-c6gqs"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.079415 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.088195 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snx2j\" (UniqueName: \"kubernetes.io/projected/1234755e-c8a2-4e9a-98ee-b700f1703728-kube-api-access-snx2j\") pod \"heat-api-5c474cb7d8-pslcd\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.171738 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.388768 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5c474cb7d8-pslcd"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.504735 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6c8fc64d77-lnt4r"]
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.508824 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c8fc64d77-lnt4r"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.512020 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.512327 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.512491 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.559298 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c8fc64d77-lnt4r"]
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.584215 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89877a72-fedb-44ba-abe3-f74344119594-log-httpd\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.584271 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-public-tls-certs\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r"
Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.584317 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch99d\" (UniqueName: \"kubernetes.io/projected/89877a72-fedb-44ba-abe3-f74344119594-kube-api-access-ch99d\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r"
Nov 29 14:52:13 crc 
kubenswrapper[4907]: I1129 14:52:13.584334 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89877a72-fedb-44ba-abe3-f74344119594-run-httpd\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.584363 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89877a72-fedb-44ba-abe3-f74344119594-etc-swift\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.584402 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-internal-tls-certs\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.584542 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-combined-ca-bundle\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.584616 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-config-data\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " 
pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.689880 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89877a72-fedb-44ba-abe3-f74344119594-log-httpd\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.689932 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-public-tls-certs\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.689971 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch99d\" (UniqueName: \"kubernetes.io/projected/89877a72-fedb-44ba-abe3-f74344119594-kube-api-access-ch99d\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.689998 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89877a72-fedb-44ba-abe3-f74344119594-run-httpd\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.690061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89877a72-fedb-44ba-abe3-f74344119594-etc-swift\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc 
kubenswrapper[4907]: I1129 14:52:13.690373 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89877a72-fedb-44ba-abe3-f74344119594-log-httpd\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.690413 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89877a72-fedb-44ba-abe3-f74344119594-run-httpd\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.690509 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-internal-tls-certs\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.690582 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-combined-ca-bundle\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.690651 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-config-data\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.697377 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-internal-tls-certs\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.701330 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89877a72-fedb-44ba-abe3-f74344119594-etc-swift\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.704366 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-combined-ca-bundle\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.704394 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-config-data\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.706293 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89877a72-fedb-44ba-abe3-f74344119594-public-tls-certs\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.731109 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch99d\" (UniqueName: 
\"kubernetes.io/projected/89877a72-fedb-44ba-abe3-f74344119594-kube-api-access-ch99d\") pod \"swift-proxy-6c8fc64d77-lnt4r\" (UID: \"89877a72-fedb-44ba-abe3-f74344119594\") " pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.756322 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-txdv6"] Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.856425 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-644d94d9d7-tvfbj"] Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.864429 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:13 crc kubenswrapper[4907]: I1129 14:52:13.887063 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-568fdbf79-c6gqs"] Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.035772 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.238065 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5c474cb7d8-pslcd"] Nov 29 14:52:14 crc kubenswrapper[4907]: W1129 14:52:14.243153 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1234755e_c8a2_4e9a_98ee_b700f1703728.slice/crio-7831697a8fbe52285e60692ee0c741357ca52c40b0e2ecbd376b3831537ad177 WatchSource:0}: Error finding container 7831697a8fbe52285e60692ee0c741357ca52c40b0e2ecbd376b3831537ad177: Status 404 returned error can't find the container with id 7831697a8fbe52285e60692ee0c741357ca52c40b0e2ecbd376b3831537ad177 Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.614396 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c8fc64d77-lnt4r"] Nov 29 14:52:14 crc kubenswrapper[4907]: W1129 14:52:14.644274 4907 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89877a72_fedb_44ba_abe3_f74344119594.slice/crio-8dc2041504ec42c2de4f42f61004acac3f71f1151c75f07e236511294072674f WatchSource:0}: Error finding container 8dc2041504ec42c2de4f42f61004acac3f71f1151c75f07e236511294072674f: Status 404 returned error can't find the container with id 8dc2041504ec42c2de4f42f61004acac3f71f1151c75f07e236511294072674f Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.722636 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-kdt5z"] Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.724019 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.770817 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kdt5z"] Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.803007 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-568fdbf79-c6gqs" event={"ID":"e7817d43-2e6d-4f73-a592-22cd7fcf8787","Type":"ContainerStarted","Data":"78d7135fa082f746944c869d0ff251b9f9b7ec5f1b261e4b532979ef7463cef8"} Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.803978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5c474cb7d8-pslcd" event={"ID":"1234755e-c8a2-4e9a-98ee-b700f1703728","Type":"ContainerStarted","Data":"7831697a8fbe52285e60692ee0c741357ca52c40b0e2ecbd376b3831537ad177"} Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.806358 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-644d94d9d7-tvfbj" event={"ID":"df2182cf-1b22-424b-8c39-e44567b07d45","Type":"ContainerStarted","Data":"66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f"} Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.806398 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/heat-engine-644d94d9d7-tvfbj" event={"ID":"df2182cf-1b22-424b-8c39-e44567b07d45","Type":"ContainerStarted","Data":"84bcfbd040f2c72fc6d1dad34dc1962763f1e56599004736a2489df9b54521ef"} Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.806550 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-644d94d9d7-tvfbj" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.808890 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c8fc64d77-lnt4r" event={"ID":"89877a72-fedb-44ba-abe3-f74344119594","Type":"ContainerStarted","Data":"8dc2041504ec42c2de4f42f61004acac3f71f1151c75f07e236511294072674f"} Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.813317 4907 generic.go:334] "Generic (PLEG): container finished" podID="3dbbcded-256c-4054-9ba2-a7b1bde35aea" containerID="692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc" exitCode=0 Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.815644 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" event={"ID":"3dbbcded-256c-4054-9ba2-a7b1bde35aea","Type":"ContainerDied","Data":"692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc"} Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.815703 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" event={"ID":"3dbbcded-256c-4054-9ba2-a7b1bde35aea","Type":"ContainerStarted","Data":"d977c1cc4cf526dd4bdaf7e4b8a62b5ef27168eec96f92fbab4252fb6f49fd14"} Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.821449 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be","Type":"ContainerStarted","Data":"a5a5633f07fbdae9793306efd318d4bdd47df9ca0451a75abbfee33bb243f180"} Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.831022 4907 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pnhr\" (UniqueName: \"kubernetes.io/projected/03f910f6-7e96-4d81-bf3f-5b7291b2da09-kube-api-access-8pnhr\") pod \"nova-api-db-create-kdt5z\" (UID: \"03f910f6-7e96-4d81-bf3f-5b7291b2da09\") " pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.831125 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f910f6-7e96-4d81-bf3f-5b7291b2da09-operator-scripts\") pod \"nova-api-db-create-kdt5z\" (UID: \"03f910f6-7e96-4d81-bf3f-5b7291b2da09\") " pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.855570 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-644d94d9d7-tvfbj" podStartSLOduration=2.855547522 podStartE2EDuration="2.855547522s" podCreationTimestamp="2025-11-29 14:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:52:14.826905436 +0000 UTC m=+1432.813743098" watchObservedRunningTime="2025-11-29 14:52:14.855547522 +0000 UTC m=+1432.842385174" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.922536 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-r68r9"] Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.924251 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.934003 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pnhr\" (UniqueName: \"kubernetes.io/projected/03f910f6-7e96-4d81-bf3f-5b7291b2da09-kube-api-access-8pnhr\") pod \"nova-api-db-create-kdt5z\" (UID: \"03f910f6-7e96-4d81-bf3f-5b7291b2da09\") " pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.934085 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f910f6-7e96-4d81-bf3f-5b7291b2da09-operator-scripts\") pod \"nova-api-db-create-kdt5z\" (UID: \"03f910f6-7e96-4d81-bf3f-5b7291b2da09\") " pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.934797 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f910f6-7e96-4d81-bf3f-5b7291b2da09-operator-scripts\") pod \"nova-api-db-create-kdt5z\" (UID: \"03f910f6-7e96-4d81-bf3f-5b7291b2da09\") " pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.967057 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4909-account-create-update-nsndx"] Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.969568 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.977781 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 29 14:52:14 crc kubenswrapper[4907]: I1129 14:52:14.978426 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pnhr\" (UniqueName: \"kubernetes.io/projected/03f910f6-7e96-4d81-bf3f-5b7291b2da09-kube-api-access-8pnhr\") pod \"nova-api-db-create-kdt5z\" (UID: \"03f910f6-7e96-4d81-bf3f-5b7291b2da09\") " pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.008489 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-r68r9"] Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.026126 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4909-account-create-update-nsndx"] Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.037059 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e61658d-77a7-46d8-9718-b3077f6f5ff6-operator-scripts\") pod \"nova-cell0-db-create-r68r9\" (UID: \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\") " pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.037115 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ca437b-20a0-410b-b071-101b1ebe27cb-operator-scripts\") pod \"nova-api-4909-account-create-update-nsndx\" (UID: \"35ca437b-20a0-410b-b071-101b1ebe27cb\") " pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.037167 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6fpsw\" (UniqueName: \"kubernetes.io/projected/7e61658d-77a7-46d8-9718-b3077f6f5ff6-kube-api-access-6fpsw\") pod \"nova-cell0-db-create-r68r9\" (UID: \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\") " pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.037186 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gx6j\" (UniqueName: \"kubernetes.io/projected/35ca437b-20a0-410b-b071-101b1ebe27cb-kube-api-access-9gx6j\") pod \"nova-api-4909-account-create-update-nsndx\" (UID: \"35ca437b-20a0-410b-b071-101b1ebe27cb\") " pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.049275 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-z87qn"] Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.050775 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.101004 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.106534 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z87qn"] Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.119112 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.134539 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a773-account-create-update-8ppqm"] Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.136325 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.139618 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.139694 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e61658d-77a7-46d8-9718-b3077f6f5ff6-operator-scripts\") pod \"nova-cell0-db-create-r68r9\" (UID: \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\") " pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.139763 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxqqj\" (UniqueName: \"kubernetes.io/projected/a01a3b28-aaff-4cda-9908-f08d0d675669-kube-api-access-vxqqj\") pod \"nova-cell1-db-create-z87qn\" (UID: \"a01a3b28-aaff-4cda-9908-f08d0d675669\") " pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.139789 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ca437b-20a0-410b-b071-101b1ebe27cb-operator-scripts\") pod \"nova-api-4909-account-create-update-nsndx\" (UID: \"35ca437b-20a0-410b-b071-101b1ebe27cb\") " pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.139882 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fpsw\" (UniqueName: \"kubernetes.io/projected/7e61658d-77a7-46d8-9718-b3077f6f5ff6-kube-api-access-6fpsw\") pod \"nova-cell0-db-create-r68r9\" (UID: \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\") " pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.139903 4907 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9gx6j\" (UniqueName: \"kubernetes.io/projected/35ca437b-20a0-410b-b071-101b1ebe27cb-kube-api-access-9gx6j\") pod \"nova-api-4909-account-create-update-nsndx\" (UID: \"35ca437b-20a0-410b-b071-101b1ebe27cb\") " pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.139968 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01a3b28-aaff-4cda-9908-f08d0d675669-operator-scripts\") pod \"nova-cell1-db-create-z87qn\" (UID: \"a01a3b28-aaff-4cda-9908-f08d0d675669\") " pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.146391 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a773-account-create-update-8ppqm"] Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.147748 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ca437b-20a0-410b-b071-101b1ebe27cb-operator-scripts\") pod \"nova-api-4909-account-create-update-nsndx\" (UID: \"35ca437b-20a0-410b-b071-101b1ebe27cb\") " pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.147761 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e61658d-77a7-46d8-9718-b3077f6f5ff6-operator-scripts\") pod \"nova-cell0-db-create-r68r9\" (UID: \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\") " pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.182482 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gx6j\" (UniqueName: \"kubernetes.io/projected/35ca437b-20a0-410b-b071-101b1ebe27cb-kube-api-access-9gx6j\") pod \"nova-api-4909-account-create-update-nsndx\" 
(UID: \"35ca437b-20a0-410b-b071-101b1ebe27cb\") " pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.190910 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fpsw\" (UniqueName: \"kubernetes.io/projected/7e61658d-77a7-46d8-9718-b3077f6f5ff6-kube-api-access-6fpsw\") pod \"nova-cell0-db-create-r68r9\" (UID: \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\") " pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.242369 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01a3b28-aaff-4cda-9908-f08d0d675669-operator-scripts\") pod \"nova-cell1-db-create-z87qn\" (UID: \"a01a3b28-aaff-4cda-9908-f08d0d675669\") " pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.242526 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddngw\" (UniqueName: \"kubernetes.io/projected/61df189f-119b-46fd-877c-87265a18f8c5-kube-api-access-ddngw\") pod \"nova-cell0-a773-account-create-update-8ppqm\" (UID: \"61df189f-119b-46fd-877c-87265a18f8c5\") " pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.242618 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxqqj\" (UniqueName: \"kubernetes.io/projected/a01a3b28-aaff-4cda-9908-f08d0d675669-kube-api-access-vxqqj\") pod \"nova-cell1-db-create-z87qn\" (UID: \"a01a3b28-aaff-4cda-9908-f08d0d675669\") " pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.242651 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/61df189f-119b-46fd-877c-87265a18f8c5-operator-scripts\") pod \"nova-cell0-a773-account-create-update-8ppqm\" (UID: \"61df189f-119b-46fd-877c-87265a18f8c5\") " pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.243027 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01a3b28-aaff-4cda-9908-f08d0d675669-operator-scripts\") pod \"nova-cell1-db-create-z87qn\" (UID: \"a01a3b28-aaff-4cda-9908-f08d0d675669\") " pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.261421 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxqqj\" (UniqueName: \"kubernetes.io/projected/a01a3b28-aaff-4cda-9908-f08d0d675669-kube-api-access-vxqqj\") pod \"nova-cell1-db-create-z87qn\" (UID: \"a01a3b28-aaff-4cda-9908-f08d0d675669\") " pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.317987 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-fabb-account-create-update-wthdh"] Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.320306 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.323889 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fabb-account-create-update-wthdh"] Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.330557 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.346938 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpk9s\" (UniqueName: \"kubernetes.io/projected/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-kube-api-access-fpk9s\") pod \"nova-cell1-fabb-account-create-update-wthdh\" (UID: \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\") " pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.346989 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-operator-scripts\") pod \"nova-cell1-fabb-account-create-update-wthdh\" (UID: \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\") " pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.347053 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61df189f-119b-46fd-877c-87265a18f8c5-operator-scripts\") pod \"nova-cell0-a773-account-create-update-8ppqm\" (UID: \"61df189f-119b-46fd-877c-87265a18f8c5\") " pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.347220 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddngw\" (UniqueName: 
\"kubernetes.io/projected/61df189f-119b-46fd-877c-87265a18f8c5-kube-api-access-ddngw\") pod \"nova-cell0-a773-account-create-update-8ppqm\" (UID: \"61df189f-119b-46fd-877c-87265a18f8c5\") " pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.349808 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61df189f-119b-46fd-877c-87265a18f8c5-operator-scripts\") pod \"nova-cell0-a773-account-create-update-8ppqm\" (UID: \"61df189f-119b-46fd-877c-87265a18f8c5\") " pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.378103 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.394500 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddngw\" (UniqueName: \"kubernetes.io/projected/61df189f-119b-46fd-877c-87265a18f8c5-kube-api-access-ddngw\") pod \"nova-cell0-a773-account-create-update-8ppqm\" (UID: \"61df189f-119b-46fd-877c-87265a18f8c5\") " pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.402502 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.429559 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.451831 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpk9s\" (UniqueName: \"kubernetes.io/projected/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-kube-api-access-fpk9s\") pod \"nova-cell1-fabb-account-create-update-wthdh\" (UID: \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\") " pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.451888 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-operator-scripts\") pod \"nova-cell1-fabb-account-create-update-wthdh\" (UID: \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\") " pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.452929 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-operator-scripts\") pod \"nova-cell1-fabb-account-create-update-wthdh\" (UID: \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\") " pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.465349 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.491102 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpk9s\" (UniqueName: \"kubernetes.io/projected/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-kube-api-access-fpk9s\") pod \"nova-cell1-fabb-account-create-update-wthdh\" (UID: \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\") " pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.769838 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kdt5z"] Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.781765 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:15 crc kubenswrapper[4907]: W1129 14:52:15.888205 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03f910f6_7e96_4d81_bf3f_5b7291b2da09.slice/crio-b2cf4553ca9344e1a41d1545b4c5aa11e756a144495ee3dfa48a5c649362bbe8 WatchSource:0}: Error finding container b2cf4553ca9344e1a41d1545b4c5aa11e756a144495ee3dfa48a5c649362bbe8: Status 404 returned error can't find the container with id b2cf4553ca9344e1a41d1545b4c5aa11e756a144495ee3dfa48a5c649362bbe8 Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.905065 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" event={"ID":"3dbbcded-256c-4054-9ba2-a7b1bde35aea","Type":"ContainerStarted","Data":"3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db"} Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.906210 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.929657 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-proxy-6c8fc64d77-lnt4r" event={"ID":"89877a72-fedb-44ba-abe3-f74344119594","Type":"ContainerStarted","Data":"d476e87fc8fb360d57e07940588d8d9d48547eec2757cc963e5d3be66e3b6014"} Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.929709 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.929724 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c8fc64d77-lnt4r" event={"ID":"89877a72-fedb-44ba-abe3-f74344119594","Type":"ContainerStarted","Data":"c4e8a3101de181b7bdaf7257c4242f4f88183d1128bb4ddc9d555b024fdcfd4c"} Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.929837 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.953924 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" podStartSLOduration=3.953900647 podStartE2EDuration="3.953900647s" podCreationTimestamp="2025-11-29 14:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:52:15.937476949 +0000 UTC m=+1433.924314621" watchObservedRunningTime="2025-11-29 14:52:15.953900647 +0000 UTC m=+1433.940738299" Nov 29 14:52:15 crc kubenswrapper[4907]: I1129 14:52:15.984832 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6c8fc64d77-lnt4r" podStartSLOduration=2.984811048 podStartE2EDuration="2.984811048s" podCreationTimestamp="2025-11-29 14:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:52:15.959936149 +0000 UTC m=+1433.946773801" watchObservedRunningTime="2025-11-29 14:52:15.984811048 
+0000 UTC m=+1433.971648700" Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.450961 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4909-account-create-update-nsndx"] Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.467527 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a773-account-create-update-8ppqm"] Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.518688 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-r68r9"] Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.518727 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z87qn"] Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.689959 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fabb-account-create-update-wthdh"] Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.810669 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.960908 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z87qn" event={"ID":"a01a3b28-aaff-4cda-9908-f08d0d675669","Type":"ContainerStarted","Data":"575e634b38d4fbeb51b7b2e0d82f7bf9728ff6c31341a297351e6571667e42cd"} Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.966342 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a773-account-create-update-8ppqm" event={"ID":"61df189f-119b-46fd-877c-87265a18f8c5","Type":"ContainerStarted","Data":"0f98006d0aeeadd5faa9989c662cd7e7ccfc65288a1c7b513f76e46839a63206"} Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.974692 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r68r9" 
event={"ID":"7e61658d-77a7-46d8-9718-b3077f6f5ff6","Type":"ContainerStarted","Data":"f04fdd161a4e393ddb14b14a33811c32fd2a57d102edf1a699be83f863560b10"} Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.977774 4907 generic.go:334] "Generic (PLEG): container finished" podID="03f910f6-7e96-4d81-bf3f-5b7291b2da09" containerID="9b0ed0f0e53fc5a10c1673f198a2349d296f194971e6d10deaad5e2922bb0270" exitCode=0 Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.977892 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kdt5z" event={"ID":"03f910f6-7e96-4d81-bf3f-5b7291b2da09","Type":"ContainerDied","Data":"9b0ed0f0e53fc5a10c1673f198a2349d296f194971e6d10deaad5e2922bb0270"} Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.977908 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kdt5z" event={"ID":"03f910f6-7e96-4d81-bf3f-5b7291b2da09","Type":"ContainerStarted","Data":"b2cf4553ca9344e1a41d1545b4c5aa11e756a144495ee3dfa48a5c649362bbe8"} Nov 29 14:52:16 crc kubenswrapper[4907]: I1129 14:52:16.989607 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-a773-account-create-update-8ppqm" podStartSLOduration=1.989584955 podStartE2EDuration="1.989584955s" podCreationTimestamp="2025-11-29 14:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:52:16.98098316 +0000 UTC m=+1434.967820812" watchObservedRunningTime="2025-11-29 14:52:16.989584955 +0000 UTC m=+1434.976422607" Nov 29 14:52:17 crc kubenswrapper[4907]: I1129 14:52:16.994792 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fabb-account-create-update-wthdh" event={"ID":"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926","Type":"ContainerStarted","Data":"1bdc683d80aab7f428a4388b76fb5bcd4bff366dbd2bfa8a3ac1f9e34677644e"} Nov 29 14:52:17 crc kubenswrapper[4907]: I1129 
14:52:17.009822 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4909-account-create-update-nsndx" event={"ID":"35ca437b-20a0-410b-b071-101b1ebe27cb","Type":"ContainerStarted","Data":"2966d29da0ed0a7075bed88302ae20a00ef758c4a24505ca81821159ac90bef9"} Nov 29 14:52:17 crc kubenswrapper[4907]: I1129 14:52:17.060832 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-4909-account-create-update-nsndx" podStartSLOduration=3.060813185 podStartE2EDuration="3.060813185s" podCreationTimestamp="2025-11-29 14:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:52:17.028317449 +0000 UTC m=+1435.015155101" watchObservedRunningTime="2025-11-29 14:52:17.060813185 +0000 UTC m=+1435.047650837" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.033025 4907 generic.go:334] "Generic (PLEG): container finished" podID="35ca437b-20a0-410b-b071-101b1ebe27cb" containerID="c0a858029248c5eaacb368575a042e7b425868d82f5bbe1809bb92bbb43b7993" exitCode=0 Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.033073 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4909-account-create-update-nsndx" event={"ID":"35ca437b-20a0-410b-b071-101b1ebe27cb","Type":"ContainerDied","Data":"c0a858029248c5eaacb368575a042e7b425868d82f5bbe1809bb92bbb43b7993"} Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.040848 4907 generic.go:334] "Generic (PLEG): container finished" podID="a01a3b28-aaff-4cda-9908-f08d0d675669" containerID="b68cb13406d0a8017927cd3d80d4e40ee73682af42ebeb1b67d5730700576161" exitCode=0 Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.040901 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z87qn" 
event={"ID":"a01a3b28-aaff-4cda-9908-f08d0d675669","Type":"ContainerDied","Data":"b68cb13406d0a8017927cd3d80d4e40ee73682af42ebeb1b67d5730700576161"} Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.046383 4907 generic.go:334] "Generic (PLEG): container finished" podID="61df189f-119b-46fd-877c-87265a18f8c5" containerID="ac9f296b80c80c7e4491e969bf1720b0d272b9d26ceeb9092b3f0debe3ea4405" exitCode=0 Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.046471 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a773-account-create-update-8ppqm" event={"ID":"61df189f-119b-46fd-877c-87265a18f8c5","Type":"ContainerDied","Data":"ac9f296b80c80c7e4491e969bf1720b0d272b9d26ceeb9092b3f0debe3ea4405"} Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.062846 4907 generic.go:334] "Generic (PLEG): container finished" podID="7e61658d-77a7-46d8-9718-b3077f6f5ff6" containerID="5bf0c625548a2fc71ba165eb7ac738064a7a77ab243cd0a4d8c97a01c5255df8" exitCode=0 Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.062932 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r68r9" event={"ID":"7e61658d-77a7-46d8-9718-b3077f6f5ff6","Type":"ContainerDied","Data":"5bf0c625548a2fc71ba165eb7ac738064a7a77ab243cd0a4d8c97a01c5255df8"} Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.067533 4907 generic.go:334] "Generic (PLEG): container finished" podID="0d19cf04-d99b-4e24-ae1d-f5bf69e3d926" containerID="61c63f7da93f96340fd0d6eb0c94c0c7d1b86db1fff9aac4c32497fef81c9ca5" exitCode=0 Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.067641 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fabb-account-create-update-wthdh" event={"ID":"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926","Type":"ContainerDied","Data":"61c63f7da93f96340fd0d6eb0c94c0c7d1b86db1fff9aac4c32497fef81c9ca5"} Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.905723 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-engine-cdc66dd5-7pf8b"] Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.907892 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.920606 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-cdc66dd5-7pf8b"] Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.933768 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-668d798c8-5zhvp"] Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.935305 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.951546 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-c8fd9b956-4rqpf"] Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.953359 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.961593 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-668d798c8-5zhvp"] Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.969026 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data-custom\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.969093 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " 
pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.969124 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.969167 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-combined-ca-bundle\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.970609 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttkl7\" (UniqueName: \"kubernetes.io/projected/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-kube-api-access-ttkl7\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.970644 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data-custom\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.970730 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-combined-ca-bundle\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: 
\"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.970750 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.970764 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data-custom\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.970804 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2qm4\" (UniqueName: \"kubernetes.io/projected/1d704079-49d7-40e3-ba60-f8cf9c281e90-kube-api-access-d2qm4\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.970962 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbxmp\" (UniqueName: \"kubernetes.io/projected/08229553-e114-45c9-a109-b01223241912-kube-api-access-mbxmp\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.971041 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-combined-ca-bundle\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:18 crc kubenswrapper[4907]: I1129 14:52:18.993686 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-c8fd9b956-4rqpf"] Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072260 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-combined-ca-bundle\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072310 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072332 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data-custom\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072370 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2qm4\" (UniqueName: \"kubernetes.io/projected/1d704079-49d7-40e3-ba60-f8cf9c281e90-kube-api-access-d2qm4\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072395 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbxmp\" (UniqueName: \"kubernetes.io/projected/08229553-e114-45c9-a109-b01223241912-kube-api-access-mbxmp\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072423 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-combined-ca-bundle\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072494 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data-custom\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072549 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072603 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072650 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-combined-ca-bundle\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072722 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttkl7\" (UniqueName: \"kubernetes.io/projected/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-kube-api-access-ttkl7\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.072746 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data-custom\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.081349 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data-custom\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.081729 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.082256 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-combined-ca-bundle\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.082274 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-combined-ca-bundle\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.082485 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.083054 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-combined-ca-bundle\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.083528 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data-custom\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.087206 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data-custom\") pod 
\"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.093540 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.101124 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2qm4\" (UniqueName: \"kubernetes.io/projected/1d704079-49d7-40e3-ba60-f8cf9c281e90-kube-api-access-d2qm4\") pod \"heat-cfnapi-c8fd9b956-4rqpf\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.101613 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbxmp\" (UniqueName: \"kubernetes.io/projected/08229553-e114-45c9-a109-b01223241912-kube-api-access-mbxmp\") pod \"heat-engine-cdc66dd5-7pf8b\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.110321 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttkl7\" (UniqueName: \"kubernetes.io/projected/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-kube-api-access-ttkl7\") pod \"heat-api-668d798c8-5zhvp\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.261923 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.262194 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.289821 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.737999 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.746199 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.893404 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e61658d-77a7-46d8-9718-b3077f6f5ff6-operator-scripts\") pod \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\" (UID: \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\") " Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.893475 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pnhr\" (UniqueName: \"kubernetes.io/projected/03f910f6-7e96-4d81-bf3f-5b7291b2da09-kube-api-access-8pnhr\") pod \"03f910f6-7e96-4d81-bf3f-5b7291b2da09\" (UID: \"03f910f6-7e96-4d81-bf3f-5b7291b2da09\") " Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.893514 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fpsw\" (UniqueName: \"kubernetes.io/projected/7e61658d-77a7-46d8-9718-b3077f6f5ff6-kube-api-access-6fpsw\") pod \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\" (UID: \"7e61658d-77a7-46d8-9718-b3077f6f5ff6\") " Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.893846 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f910f6-7e96-4d81-bf3f-5b7291b2da09-operator-scripts\") pod 
\"03f910f6-7e96-4d81-bf3f-5b7291b2da09\" (UID: \"03f910f6-7e96-4d81-bf3f-5b7291b2da09\") " Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.894053 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e61658d-77a7-46d8-9718-b3077f6f5ff6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e61658d-77a7-46d8-9718-b3077f6f5ff6" (UID: "7e61658d-77a7-46d8-9718-b3077f6f5ff6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.894703 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e61658d-77a7-46d8-9718-b3077f6f5ff6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.894743 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03f910f6-7e96-4d81-bf3f-5b7291b2da09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03f910f6-7e96-4d81-bf3f-5b7291b2da09" (UID: "03f910f6-7e96-4d81-bf3f-5b7291b2da09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.897976 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e61658d-77a7-46d8-9718-b3077f6f5ff6-kube-api-access-6fpsw" (OuterVolumeSpecName: "kube-api-access-6fpsw") pod "7e61658d-77a7-46d8-9718-b3077f6f5ff6" (UID: "7e61658d-77a7-46d8-9718-b3077f6f5ff6"). InnerVolumeSpecName "kube-api-access-6fpsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:19 crc kubenswrapper[4907]: I1129 14:52:19.899120 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f910f6-7e96-4d81-bf3f-5b7291b2da09-kube-api-access-8pnhr" (OuterVolumeSpecName: "kube-api-access-8pnhr") pod "03f910f6-7e96-4d81-bf3f-5b7291b2da09" (UID: "03f910f6-7e96-4d81-bf3f-5b7291b2da09"). InnerVolumeSpecName "kube-api-access-8pnhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:19.998954 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pnhr\" (UniqueName: \"kubernetes.io/projected/03f910f6-7e96-4d81-bf3f-5b7291b2da09-kube-api-access-8pnhr\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:19.999197 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fpsw\" (UniqueName: \"kubernetes.io/projected/7e61658d-77a7-46d8-9718-b3077f6f5ff6-kube-api-access-6fpsw\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:19.999207 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f910f6-7e96-4d81-bf3f-5b7291b2da09-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.130447 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fabb-account-create-update-wthdh" event={"ID":"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926","Type":"ContainerDied","Data":"1bdc683d80aab7f428a4388b76fb5bcd4bff366dbd2bfa8a3ac1f9e34677644e"} Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.131316 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bdc683d80aab7f428a4388b76fb5bcd4bff366dbd2bfa8a3ac1f9e34677644e" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.132910 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-4909-account-create-update-nsndx" event={"ID":"35ca437b-20a0-410b-b071-101b1ebe27cb","Type":"ContainerDied","Data":"2966d29da0ed0a7075bed88302ae20a00ef758c4a24505ca81821159ac90bef9"} Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.133013 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2966d29da0ed0a7075bed88302ae20a00ef758c4a24505ca81821159ac90bef9" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.142165 4907 generic.go:334] "Generic (PLEG): container finished" podID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerID="bb5021954c0564f02f26a1e397494dbee334ab66f26e5cd69e9f1ab394875cff" exitCode=137 Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.142183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7c3295c-d537-4302-80c1-ce39f0f4fcb4","Type":"ContainerDied","Data":"bb5021954c0564f02f26a1e397494dbee334ab66f26e5cd69e9f1ab394875cff"} Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.145790 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r68r9" event={"ID":"7e61658d-77a7-46d8-9718-b3077f6f5ff6","Type":"ContainerDied","Data":"f04fdd161a4e393ddb14b14a33811c32fd2a57d102edf1a699be83f863560b10"} Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.146374 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f04fdd161a4e393ddb14b14a33811c32fd2a57d102edf1a699be83f863560b10" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.146455 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-r68r9" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.152362 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kdt5z" event={"ID":"03f910f6-7e96-4d81-bf3f-5b7291b2da09","Type":"ContainerDied","Data":"b2cf4553ca9344e1a41d1545b4c5aa11e756a144495ee3dfa48a5c649362bbe8"} Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.152394 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2cf4553ca9344e1a41d1545b4c5aa11e756a144495ee3dfa48a5c649362bbe8" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.152458 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kdt5z" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.396982 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.435802 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.460784 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.468794 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.510220 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-operator-scripts\") pod \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\" (UID: \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.510490 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpk9s\" (UniqueName: \"kubernetes.io/projected/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-kube-api-access-fpk9s\") pod \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\" (UID: \"0d19cf04-d99b-4e24-ae1d-f5bf69e3d926\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.511485 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d19cf04-d99b-4e24-ae1d-f5bf69e3d926" (UID: "0d19cf04-d99b-4e24-ae1d-f5bf69e3d926"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.519921 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-kube-api-access-fpk9s" (OuterVolumeSpecName: "kube-api-access-fpk9s") pod "0d19cf04-d99b-4e24-ae1d-f5bf69e3d926" (UID: "0d19cf04-d99b-4e24-ae1d-f5bf69e3d926"). InnerVolumeSpecName "kube-api-access-fpk9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.612245 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61df189f-119b-46fd-877c-87265a18f8c5-operator-scripts\") pod \"61df189f-119b-46fd-877c-87265a18f8c5\" (UID: \"61df189f-119b-46fd-877c-87265a18f8c5\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.612392 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxqqj\" (UniqueName: \"kubernetes.io/projected/a01a3b28-aaff-4cda-9908-f08d0d675669-kube-api-access-vxqqj\") pod \"a01a3b28-aaff-4cda-9908-f08d0d675669\" (UID: \"a01a3b28-aaff-4cda-9908-f08d0d675669\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.612567 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddngw\" (UniqueName: \"kubernetes.io/projected/61df189f-119b-46fd-877c-87265a18f8c5-kube-api-access-ddngw\") pod \"61df189f-119b-46fd-877c-87265a18f8c5\" (UID: \"61df189f-119b-46fd-877c-87265a18f8c5\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.612616 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01a3b28-aaff-4cda-9908-f08d0d675669-operator-scripts\") pod \"a01a3b28-aaff-4cda-9908-f08d0d675669\" (UID: \"a01a3b28-aaff-4cda-9908-f08d0d675669\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.612695 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61df189f-119b-46fd-877c-87265a18f8c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61df189f-119b-46fd-877c-87265a18f8c5" (UID: "61df189f-119b-46fd-877c-87265a18f8c5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.612711 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gx6j\" (UniqueName: \"kubernetes.io/projected/35ca437b-20a0-410b-b071-101b1ebe27cb-kube-api-access-9gx6j\") pod \"35ca437b-20a0-410b-b071-101b1ebe27cb\" (UID: \"35ca437b-20a0-410b-b071-101b1ebe27cb\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.612884 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ca437b-20a0-410b-b071-101b1ebe27cb-operator-scripts\") pod \"35ca437b-20a0-410b-b071-101b1ebe27cb\" (UID: \"35ca437b-20a0-410b-b071-101b1ebe27cb\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.614024 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.614041 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61df189f-119b-46fd-877c-87265a18f8c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.614051 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpk9s\" (UniqueName: \"kubernetes.io/projected/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926-kube-api-access-fpk9s\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.617990 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ca437b-20a0-410b-b071-101b1ebe27cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35ca437b-20a0-410b-b071-101b1ebe27cb" (UID: "35ca437b-20a0-410b-b071-101b1ebe27cb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.620171 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01a3b28-aaff-4cda-9908-f08d0d675669-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a01a3b28-aaff-4cda-9908-f08d0d675669" (UID: "a01a3b28-aaff-4cda-9908-f08d0d675669"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.629134 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61df189f-119b-46fd-877c-87265a18f8c5-kube-api-access-ddngw" (OuterVolumeSpecName: "kube-api-access-ddngw") pod "61df189f-119b-46fd-877c-87265a18f8c5" (UID: "61df189f-119b-46fd-877c-87265a18f8c5"). InnerVolumeSpecName "kube-api-access-ddngw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.630832 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ca437b-20a0-410b-b071-101b1ebe27cb-kube-api-access-9gx6j" (OuterVolumeSpecName: "kube-api-access-9gx6j") pod "35ca437b-20a0-410b-b071-101b1ebe27cb" (UID: "35ca437b-20a0-410b-b071-101b1ebe27cb"). InnerVolumeSpecName "kube-api-access-9gx6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.639146 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01a3b28-aaff-4cda-9908-f08d0d675669-kube-api-access-vxqqj" (OuterVolumeSpecName: "kube-api-access-vxqqj") pod "a01a3b28-aaff-4cda-9908-f08d0d675669" (UID: "a01a3b28-aaff-4cda-9908-f08d0d675669"). InnerVolumeSpecName "kube-api-access-vxqqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.717130 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gx6j\" (UniqueName: \"kubernetes.io/projected/35ca437b-20a0-410b-b071-101b1ebe27cb-kube-api-access-9gx6j\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.717159 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ca437b-20a0-410b-b071-101b1ebe27cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.717169 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxqqj\" (UniqueName: \"kubernetes.io/projected/a01a3b28-aaff-4cda-9908-f08d0d675669-kube-api-access-vxqqj\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.717177 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddngw\" (UniqueName: \"kubernetes.io/projected/61df189f-119b-46fd-877c-87265a18f8c5-kube-api-access-ddngw\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.717186 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01a3b28-aaff-4cda-9908-f08d0d675669-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.834755 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.913358 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-c8fd9b956-4rqpf"] Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.925724 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-cdc66dd5-7pf8b"] Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.935221 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-scripts\") pod \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.935253 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7vb4\" (UniqueName: \"kubernetes.io/projected/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-kube-api-access-q7vb4\") pod \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.935269 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-sg-core-conf-yaml\") pod \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.935350 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-combined-ca-bundle\") pod \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.935512 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-log-httpd\") pod \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.938531 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-run-httpd\") pod \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.938677 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-config-data\") pod \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\" (UID: \"f7c3295c-d537-4302-80c1-ce39f0f4fcb4\") " Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.939381 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-scripts" (OuterVolumeSpecName: "scripts") pod "f7c3295c-d537-4302-80c1-ce39f0f4fcb4" (UID: "f7c3295c-d537-4302-80c1-ce39f0f4fcb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.939575 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f7c3295c-d537-4302-80c1-ce39f0f4fcb4" (UID: "f7c3295c-d537-4302-80c1-ce39f0f4fcb4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.940281 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-kube-api-access-q7vb4" (OuterVolumeSpecName: "kube-api-access-q7vb4") pod "f7c3295c-d537-4302-80c1-ce39f0f4fcb4" (UID: "f7c3295c-d537-4302-80c1-ce39f0f4fcb4"). InnerVolumeSpecName "kube-api-access-q7vb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.940280 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f7c3295c-d537-4302-80c1-ce39f0f4fcb4" (UID: "f7c3295c-d537-4302-80c1-ce39f0f4fcb4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.942916 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-668d798c8-5zhvp"] Nov 29 14:52:20 crc kubenswrapper[4907]: I1129 14:52:20.969905 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f7c3295c-d537-4302-80c1-ce39f0f4fcb4" (UID: "f7c3295c-d537-4302-80c1-ce39f0f4fcb4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.025551 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7c3295c-d537-4302-80c1-ce39f0f4fcb4" (UID: "f7c3295c-d537-4302-80c1-ce39f0f4fcb4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.042756 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.042798 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7vb4\" (UniqueName: \"kubernetes.io/projected/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-kube-api-access-q7vb4\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.042808 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.042819 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.042828 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.042836 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.055004 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-config-data" (OuterVolumeSpecName: "config-data") pod "f7c3295c-d537-4302-80c1-ce39f0f4fcb4" (UID: "f7c3295c-d537-4302-80c1-ce39f0f4fcb4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.144880 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c3295c-d537-4302-80c1-ce39f0f4fcb4-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.170578 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" event={"ID":"1d704079-49d7-40e3-ba60-f8cf9c281e90","Type":"ContainerStarted","Data":"44f6a7fe15e89be8e31d38b2fc68ed4f8fdb0e78be0ccc1223b5a03202af20f2"} Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.172186 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-668d798c8-5zhvp" event={"ID":"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa","Type":"ContainerStarted","Data":"61b5c0e9cf1fa55dc5b5c3e2c66f304860bac83954212d9e323cf34ea8e77ad1"} Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.174398 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z87qn" event={"ID":"a01a3b28-aaff-4cda-9908-f08d0d675669","Type":"ContainerDied","Data":"575e634b38d4fbeb51b7b2e0d82f7bf9728ff6c31341a297351e6571667e42cd"} Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.174453 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="575e634b38d4fbeb51b7b2e0d82f7bf9728ff6c31341a297351e6571667e42cd" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.174546 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-z87qn" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.189263 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a773-account-create-update-8ppqm" event={"ID":"61df189f-119b-46fd-877c-87265a18f8c5","Type":"ContainerDied","Data":"0f98006d0aeeadd5faa9989c662cd7e7ccfc65288a1c7b513f76e46839a63206"} Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.189303 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f98006d0aeeadd5faa9989c662cd7e7ccfc65288a1c7b513f76e46839a63206" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.189387 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a773-account-create-update-8ppqm" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.191737 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-cdc66dd5-7pf8b" event={"ID":"08229553-e114-45c9-a109-b01223241912","Type":"ContainerStarted","Data":"d596db3276e7955a15284e266b1a29e22a79ae44a6e5858fefd5ef09c31529a0"} Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.196962 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-568fdbf79-c6gqs" event={"ID":"e7817d43-2e6d-4f73-a592-22cd7fcf8787","Type":"ContainerStarted","Data":"448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e"} Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.198834 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-568fdbf79-c6gqs" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.211824 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7c3295c-d537-4302-80c1-ce39f0f4fcb4","Type":"ContainerDied","Data":"499711dd8cfbdb0f523b9fc9312dece2df6849116f91146861cc3a366233d358"} Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.211885 4907 scope.go:117] 
"RemoveContainer" containerID="bb5021954c0564f02f26a1e397494dbee334ab66f26e5cd69e9f1ab394875cff" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.211951 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.217990 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4909-account-create-update-nsndx" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.218149 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5c474cb7d8-pslcd" event={"ID":"1234755e-c8a2-4e9a-98ee-b700f1703728","Type":"ContainerStarted","Data":"805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04"} Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.218997 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5c474cb7d8-pslcd" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.219067 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fabb-account-create-update-wthdh" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.285558 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-568fdbf79-c6gqs" podStartSLOduration=3.4048995 podStartE2EDuration="9.285538694s" podCreationTimestamp="2025-11-29 14:52:12 +0000 UTC" firstStartedPulling="2025-11-29 14:52:13.910712294 +0000 UTC m=+1431.897549946" lastFinishedPulling="2025-11-29 14:52:19.791351488 +0000 UTC m=+1437.778189140" observedRunningTime="2025-11-29 14:52:21.23487702 +0000 UTC m=+1439.221714672" watchObservedRunningTime="2025-11-29 14:52:21.285538694 +0000 UTC m=+1439.272376346" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.295893 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5c474cb7d8-pslcd" podStartSLOduration=3.755470942 podStartE2EDuration="9.295876499s" podCreationTimestamp="2025-11-29 14:52:12 +0000 UTC" firstStartedPulling="2025-11-29 14:52:14.250946791 +0000 UTC m=+1432.237784443" lastFinishedPulling="2025-11-29 14:52:19.791352358 +0000 UTC m=+1437.778190000" observedRunningTime="2025-11-29 14:52:21.265845463 +0000 UTC m=+1439.252683105" watchObservedRunningTime="2025-11-29 14:52:21.295876499 +0000 UTC m=+1439.282714151" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.378245 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-568fdbf79-c6gqs"] Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.385686 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5c474cb7d8-pslcd"] Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.419269 4907 scope.go:117] "RemoveContainer" containerID="93978f213f0dd7390dc158c33afad7e0b5684fa268d23626f48362b017a603ae" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.461204 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-9f67f96b7-8krnv"] Nov 29 
14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.462033 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ca437b-20a0-410b-b071-101b1ebe27cb" containerName="mariadb-account-create-update" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462052 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ca437b-20a0-410b-b071-101b1ebe27cb" containerName="mariadb-account-create-update" Nov 29 14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.462093 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e61658d-77a7-46d8-9718-b3077f6f5ff6" containerName="mariadb-database-create" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462100 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e61658d-77a7-46d8-9718-b3077f6f5ff6" containerName="mariadb-database-create" Nov 29 14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.462116 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="sg-core" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462122 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="sg-core" Nov 29 14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.462139 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="proxy-httpd" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462172 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="proxy-httpd" Nov 29 14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.462187 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="ceilometer-notification-agent" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462192 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" 
containerName="ceilometer-notification-agent" Nov 29 14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.462205 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f910f6-7e96-4d81-bf3f-5b7291b2da09" containerName="mariadb-database-create" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462211 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f910f6-7e96-4d81-bf3f-5b7291b2da09" containerName="mariadb-database-create" Nov 29 14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.462219 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61df189f-119b-46fd-877c-87265a18f8c5" containerName="mariadb-account-create-update" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462256 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="61df189f-119b-46fd-877c-87265a18f8c5" containerName="mariadb-account-create-update" Nov 29 14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.462274 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01a3b28-aaff-4cda-9908-f08d0d675669" containerName="mariadb-database-create" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462282 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01a3b28-aaff-4cda-9908-f08d0d675669" containerName="mariadb-database-create" Nov 29 14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.462300 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d19cf04-d99b-4e24-ae1d-f5bf69e3d926" containerName="mariadb-account-create-update" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462330 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d19cf04-d99b-4e24-ae1d-f5bf69e3d926" containerName="mariadb-account-create-update" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462745 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d19cf04-d99b-4e24-ae1d-f5bf69e3d926" containerName="mariadb-account-create-update" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 
14:52:21.462800 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e61658d-77a7-46d8-9718-b3077f6f5ff6" containerName="mariadb-database-create" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462812 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01a3b28-aaff-4cda-9908-f08d0d675669" containerName="mariadb-database-create" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.462831 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="sg-core" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.465919 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="proxy-httpd" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.465992 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" containerName="ceilometer-notification-agent" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.466055 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f910f6-7e96-4d81-bf3f-5b7291b2da09" containerName="mariadb-database-create" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.466074 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ca437b-20a0-410b-b071-101b1ebe27cb" containerName="mariadb-account-create-update" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.466086 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="61df189f-119b-46fd-877c-87265a18f8c5" containerName="mariadb-account-create-update" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.469457 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.473032 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.475242 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.501154 4907 scope.go:117] "RemoveContainer" containerID="5bb03f96ea58f6710f35fa0e0f2b34eaf307facac712b36e63bcc71d9ab89995" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.501598 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-9f67f96b7-8krnv"] Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.524497 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-54546bdf79-77p2l"] Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.526028 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.532045 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.532222 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.532789 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54546bdf79-77p2l"] Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554250 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554294 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-internal-tls-certs\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554317 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data-custom\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554339 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554363 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htr2k\" (UniqueName: \"kubernetes.io/projected/2ea22f3e-15c5-4f6e-9269-1da424d29342-kube-api-access-htr2k\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554379 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data-custom\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554410 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-public-tls-certs\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554430 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-combined-ca-bundle\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554521 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-combined-ca-bundle\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554579 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-public-tls-certs\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554598 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbp6w\" (UniqueName: \"kubernetes.io/projected/62227625-11c5-4d0a-b990-a1995069e259-kube-api-access-vbp6w\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.554643 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-internal-tls-certs\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.560643 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.571532 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.582941 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.586230 
4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.588021 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.589764 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.621133 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657604 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-public-tls-certs\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657648 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbp6w\" (UniqueName: \"kubernetes.io/projected/62227625-11c5-4d0a-b990-a1995069e259-kube-api-access-vbp6w\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657688 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-internal-tls-certs\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657748 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657764 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-internal-tls-certs\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657783 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data-custom\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657800 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657827 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htr2k\" (UniqueName: \"kubernetes.io/projected/2ea22f3e-15c5-4f6e-9269-1da424d29342-kube-api-access-htr2k\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657841 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data-custom\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657874 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-public-tls-certs\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657893 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-combined-ca-bundle\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.657947 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-combined-ca-bundle\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.662476 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-internal-tls-certs\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.671564 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-public-tls-certs\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.672861 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.675566 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-combined-ca-bundle\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.676607 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-combined-ca-bundle\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.677592 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.678191 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data-custom\") pod \"heat-cfnapi-9f67f96b7-8krnv\" 
(UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.678466 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data-custom\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.681515 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-public-tls-certs\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.686428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htr2k\" (UniqueName: \"kubernetes.io/projected/2ea22f3e-15c5-4f6e-9269-1da424d29342-kube-api-access-htr2k\") pod \"heat-api-54546bdf79-77p2l\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.700188 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbp6w\" (UniqueName: \"kubernetes.io/projected/62227625-11c5-4d0a-b990-a1995069e259-kube-api-access-vbp6w\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: E1129 14:52:21.724593 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d19cf04_d99b_4e24_ae1d_f5bf69e3d926.slice/crio-1bdc683d80aab7f428a4388b76fb5bcd4bff366dbd2bfa8a3ac1f9e34677644e\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda01a3b28_aaff_4cda_9908_f08d0d675669.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d19cf04_d99b_4e24_ae1d_f5bf69e3d926.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda01a3b28_aaff_4cda_9908_f08d0d675669.slice/crio-575e634b38d4fbeb51b7b2e0d82f7bf9728ff6c31341a297351e6571667e42cd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61df189f_119b_46fd_877c_87265a18f8c5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35ca437b_20a0_410b_b071_101b1ebe27cb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c3295c_d537_4302_80c1_ce39f0f4fcb4.slice/crio-499711dd8cfbdb0f523b9fc9312dece2df6849116f91146861cc3a366233d358\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35ca437b_20a0_410b_b071_101b1ebe27cb.slice/crio-2966d29da0ed0a7075bed88302ae20a00ef758c4a24505ca81821159ac90bef9\": RecentStats: unable to find data in memory cache]" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.739652 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-internal-tls-certs\") pod \"heat-cfnapi-9f67f96b7-8krnv\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.760922 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v9nz\" 
(UniqueName: \"kubernetes.io/projected/25be0daa-85b7-4739-a025-6d41b05442a0-kube-api-access-7v9nz\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.761284 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-run-httpd\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.761380 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.761628 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-config-data\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.761708 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-log-httpd\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.763556 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-scripts\") pod \"ceilometer-0\" (UID: 
\"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.763700 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.853982 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.864880 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.867320 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.867503 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v9nz\" (UniqueName: \"kubernetes.io/projected/25be0daa-85b7-4739-a025-6d41b05442a0-kube-api-access-7v9nz\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.867531 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-run-httpd\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.867561 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.867578 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-config-data\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.867603 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-log-httpd\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.867631 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-scripts\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.869386 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-run-httpd\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.869626 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-log-httpd\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc 
kubenswrapper[4907]: I1129 14:52:21.876646 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.877627 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-config-data\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.884168 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-scripts\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.884622 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v9nz\" (UniqueName: \"kubernetes.io/projected/25be0daa-85b7-4739-a025-6d41b05442a0-kube-api-access-7v9nz\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:21 crc kubenswrapper[4907]: I1129 14:52:21.885494 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " pod="openstack/ceilometer-0" Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.128150 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.272715 4907 generic.go:334] "Generic (PLEG): container finished" podID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" containerID="a1b18e1f59ac5831fbcffee0aae425530e6b510a1e4bf95eb3093806b27866a8" exitCode=1 Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.272773 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-668d798c8-5zhvp" event={"ID":"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa","Type":"ContainerDied","Data":"a1b18e1f59ac5831fbcffee0aae425530e6b510a1e4bf95eb3093806b27866a8"} Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.273479 4907 scope.go:117] "RemoveContainer" containerID="a1b18e1f59ac5831fbcffee0aae425530e6b510a1e4bf95eb3093806b27866a8" Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.294832 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-cdc66dd5-7pf8b" event={"ID":"08229553-e114-45c9-a109-b01223241912","Type":"ContainerStarted","Data":"297d0cd5ee3cbab6d0c630d62813378481056594efe1d066186f89706e7980ac"} Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.298525 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.327707 4907 generic.go:334] "Generic (PLEG): container finished" podID="1d704079-49d7-40e3-ba60-f8cf9c281e90" containerID="18df76db83d6ae2a4a06d10253e796e0de5d103b9ad8d07bbba3215a5fca6150" exitCode=1 Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.328316 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" event={"ID":"1d704079-49d7-40e3-ba60-f8cf9c281e90","Type":"ContainerDied","Data":"18df76db83d6ae2a4a06d10253e796e0de5d103b9ad8d07bbba3215a5fca6150"} Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.328947 4907 scope.go:117] "RemoveContainer" 
containerID="18df76db83d6ae2a4a06d10253e796e0de5d103b9ad8d07bbba3215a5fca6150" Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.335335 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-cdc66dd5-7pf8b" podStartSLOduration=4.335311783 podStartE2EDuration="4.335311783s" podCreationTimestamp="2025-11-29 14:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:52:22.328227881 +0000 UTC m=+1440.315065543" watchObservedRunningTime="2025-11-29 14:52:22.335311783 +0000 UTC m=+1440.322149435" Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.389204 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-9f67f96b7-8krnv"] Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.519551 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c3295c-d537-4302-80c1-ce39f0f4fcb4" path="/var/lib/kubelet/pods/f7c3295c-d537-4302-80c1-ce39f0f4fcb4/volumes" Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.570200 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54546bdf79-77p2l"] Nov 29 14:52:22 crc kubenswrapper[4907]: W1129 14:52:22.582217 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ea22f3e_15c5_4f6e_9269_1da424d29342.slice/crio-6997f44751d0223ce751251fb0e639903ab1fd2ffcfae0430743d397842a4e59 WatchSource:0}: Error finding container 6997f44751d0223ce751251fb0e639903ab1fd2ffcfae0430743d397842a4e59: Status 404 returned error can't find the container with id 6997f44751d0223ce751251fb0e639903ab1fd2ffcfae0430743d397842a4e59 Nov 29 14:52:22 crc kubenswrapper[4907]: I1129 14:52:22.761481 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:22 crc kubenswrapper[4907]: W1129 14:52:22.770316 4907 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25be0daa_85b7_4739_a025_6d41b05442a0.slice/crio-34a69ea94c2b125891a6ecede66920bfdbfe72f4b46803b2779adf93595d7a69 WatchSource:0}: Error finding container 34a69ea94c2b125891a6ecede66920bfdbfe72f4b46803b2779adf93595d7a69: Status 404 returned error can't find the container with id 34a69ea94c2b125891a6ecede66920bfdbfe72f4b46803b2779adf93595d7a69 Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.052601 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.114001 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2gnn4"] Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.114304 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" podUID="74afd491-5339-485f-8a90-91f658a5f98e" containerName="dnsmasq-dns" containerID="cri-o://3ed07bdb3db1e6d5e8c30418a10bb6df7ef82641a7b2bfb6911c24ef9bd299af" gracePeriod=10 Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.419279 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerStarted","Data":"34a69ea94c2b125891a6ecede66920bfdbfe72f4b46803b2779adf93595d7a69"} Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.462589 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54546bdf79-77p2l" event={"ID":"2ea22f3e-15c5-4f6e-9269-1da424d29342","Type":"ContainerStarted","Data":"35e7b30e09c58a06501159656c980ee970d673c53b74f12ff1639fc6aa15e539"} Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.462656 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54546bdf79-77p2l" 
event={"ID":"2ea22f3e-15c5-4f6e-9269-1da424d29342","Type":"ContainerStarted","Data":"6997f44751d0223ce751251fb0e639903ab1fd2ffcfae0430743d397842a4e59"} Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.462698 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.534114 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-54546bdf79-77p2l" podStartSLOduration=2.53409504 podStartE2EDuration="2.53409504s" podCreationTimestamp="2025-11-29 14:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:52:23.518114365 +0000 UTC m=+1441.504952027" watchObservedRunningTime="2025-11-29 14:52:23.53409504 +0000 UTC m=+1441.520932692" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.551109 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" event={"ID":"62227625-11c5-4d0a-b990-a1995069e259","Type":"ContainerStarted","Data":"289d1d6822408120efa26bdf6670d03e6d1d5d6da01a556794102056704114d8"} Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.551155 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" event={"ID":"62227625-11c5-4d0a-b990-a1995069e259","Type":"ContainerStarted","Data":"7abf1bc478f3716cc213a7aee10c776620cb5f105fbfb112eaa3f99bb9ca8d17"} Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.551346 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.619022 4907 generic.go:334] "Generic (PLEG): container finished" podID="74afd491-5339-485f-8a90-91f658a5f98e" containerID="3ed07bdb3db1e6d5e8c30418a10bb6df7ef82641a7b2bfb6911c24ef9bd299af" exitCode=0 Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 
14:52:23.619102 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" event={"ID":"74afd491-5339-485f-8a90-91f658a5f98e","Type":"ContainerDied","Data":"3ed07bdb3db1e6d5e8c30418a10bb6df7ef82641a7b2bfb6911c24ef9bd299af"} Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.623164 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" podStartSLOduration=2.623149228 podStartE2EDuration="2.623149228s" podCreationTimestamp="2025-11-29 14:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:52:23.618433434 +0000 UTC m=+1441.605271086" watchObservedRunningTime="2025-11-29 14:52:23.623149228 +0000 UTC m=+1441.609986880" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.627411 4907 generic.go:334] "Generic (PLEG): container finished" podID="1d704079-49d7-40e3-ba60-f8cf9c281e90" containerID="28e8ec775ce7d721c8aeab01c185bf0037e993010406c9f04642355591ad8d7f" exitCode=1 Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.627660 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" event={"ID":"1d704079-49d7-40e3-ba60-f8cf9c281e90","Type":"ContainerDied","Data":"28e8ec775ce7d721c8aeab01c185bf0037e993010406c9f04642355591ad8d7f"} Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.627697 4907 scope.go:117] "RemoveContainer" containerID="18df76db83d6ae2a4a06d10253e796e0de5d103b9ad8d07bbba3215a5fca6150" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.628209 4907 scope.go:117] "RemoveContainer" containerID="28e8ec775ce7d721c8aeab01c185bf0037e993010406c9f04642355591ad8d7f" Nov 29 14:52:23 crc kubenswrapper[4907]: E1129 14:52:23.628568 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=heat-cfnapi pod=heat-cfnapi-c8fd9b956-4rqpf_openstack(1d704079-49d7-40e3-ba60-f8cf9c281e90)\"" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.630568 4907 generic.go:334] "Generic (PLEG): container finished" podID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" containerID="e6bb4364a6f79a33542c91557405459b01314dfa7b1d833c4034f58d11b7f29f" exitCode=1 Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.631303 4907 scope.go:117] "RemoveContainer" containerID="e6bb4364a6f79a33542c91557405459b01314dfa7b1d833c4034f58d11b7f29f" Nov 29 14:52:23 crc kubenswrapper[4907]: E1129 14:52:23.631527 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-668d798c8-5zhvp_openstack(3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa)\"" pod="openstack/heat-api-668d798c8-5zhvp" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.631555 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-668d798c8-5zhvp" event={"ID":"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa","Type":"ContainerDied","Data":"e6bb4364a6f79a33542c91557405459b01314dfa7b1d833c4034f58d11b7f29f"} Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.631663 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-568fdbf79-c6gqs" podUID="e7817d43-2e6d-4f73-a592-22cd7fcf8787" containerName="heat-cfnapi" containerID="cri-o://448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e" gracePeriod=60 Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.631790 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5c474cb7d8-pslcd" podUID="1234755e-c8a2-4e9a-98ee-b700f1703728" containerName="heat-api" 
containerID="cri-o://805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04" gracePeriod=60 Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.744202 4907 scope.go:117] "RemoveContainer" containerID="a1b18e1f59ac5831fbcffee0aae425530e6b510a1e4bf95eb3093806b27866a8" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.873196 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:23 crc kubenswrapper[4907]: I1129 14:52:23.880977 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c8fc64d77-lnt4r" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.131666 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.262512 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.262561 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.280224 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-swift-storage-0\") pod \"74afd491-5339-485f-8a90-91f658a5f98e\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.280282 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-sb\") pod \"74afd491-5339-485f-8a90-91f658a5f98e\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.280361 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-nb\") pod \"74afd491-5339-485f-8a90-91f658a5f98e\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.280521 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-svc\") pod \"74afd491-5339-485f-8a90-91f658a5f98e\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.280567 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-config\") pod \"74afd491-5339-485f-8a90-91f658a5f98e\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.280588 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlf5m\" (UniqueName: \"kubernetes.io/projected/74afd491-5339-485f-8a90-91f658a5f98e-kube-api-access-zlf5m\") pod \"74afd491-5339-485f-8a90-91f658a5f98e\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.290709 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.290877 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.293112 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74afd491-5339-485f-8a90-91f658a5f98e-kube-api-access-zlf5m" (OuterVolumeSpecName: "kube-api-access-zlf5m") pod 
"74afd491-5339-485f-8a90-91f658a5f98e" (UID: "74afd491-5339-485f-8a90-91f658a5f98e"). InnerVolumeSpecName "kube-api-access-zlf5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.379254 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74afd491-5339-485f-8a90-91f658a5f98e" (UID: "74afd491-5339-485f-8a90-91f658a5f98e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.383201 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlf5m\" (UniqueName: \"kubernetes.io/projected/74afd491-5339-485f-8a90-91f658a5f98e-kube-api-access-zlf5m\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.383224 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.424599 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74afd491-5339-485f-8a90-91f658a5f98e" (UID: "74afd491-5339-485f-8a90-91f658a5f98e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.426036 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74afd491-5339-485f-8a90-91f658a5f98e" (UID: "74afd491-5339-485f-8a90-91f658a5f98e"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.434376 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74afd491-5339-485f-8a90-91f658a5f98e" (UID: "74afd491-5339-485f-8a90-91f658a5f98e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.485503 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-config" (OuterVolumeSpecName: "config") pod "74afd491-5339-485f-8a90-91f658a5f98e" (UID: "74afd491-5339-485f-8a90-91f658a5f98e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.487145 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-config\") pod \"74afd491-5339-485f-8a90-91f658a5f98e\" (UID: \"74afd491-5339-485f-8a90-91f658a5f98e\") " Nov 29 14:52:24 crc kubenswrapper[4907]: W1129 14:52:24.487946 4907 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/74afd491-5339-485f-8a90-91f658a5f98e/volumes/kubernetes.io~configmap/config Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.487987 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-config" (OuterVolumeSpecName: "config") pod "74afd491-5339-485f-8a90-91f658a5f98e" (UID: "74afd491-5339-485f-8a90-91f658a5f98e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.490898 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.491017 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.491079 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.491149 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74afd491-5339-485f-8a90-91f658a5f98e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.553609 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5c474cb7d8-pslcd" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.591333 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-568fdbf79-c6gqs" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.592698 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data-custom\") pod \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.592920 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-combined-ca-bundle\") pod \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.593012 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8vgc\" (UniqueName: \"kubernetes.io/projected/e7817d43-2e6d-4f73-a592-22cd7fcf8787-kube-api-access-m8vgc\") pod \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.593099 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-combined-ca-bundle\") pod \"1234755e-c8a2-4e9a-98ee-b700f1703728\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.593207 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snx2j\" (UniqueName: \"kubernetes.io/projected/1234755e-c8a2-4e9a-98ee-b700f1703728-kube-api-access-snx2j\") pod \"1234755e-c8a2-4e9a-98ee-b700f1703728\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.593426 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data-custom\") pod \"1234755e-c8a2-4e9a-98ee-b700f1703728\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.593942 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data\") pod \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\" (UID: \"e7817d43-2e6d-4f73-a592-22cd7fcf8787\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.594540 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data\") pod \"1234755e-c8a2-4e9a-98ee-b700f1703728\" (UID: \"1234755e-c8a2-4e9a-98ee-b700f1703728\") " Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.598671 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1234755e-c8a2-4e9a-98ee-b700f1703728-kube-api-access-snx2j" (OuterVolumeSpecName: "kube-api-access-snx2j") pod "1234755e-c8a2-4e9a-98ee-b700f1703728" (UID: "1234755e-c8a2-4e9a-98ee-b700f1703728"). InnerVolumeSpecName "kube-api-access-snx2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.599213 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1234755e-c8a2-4e9a-98ee-b700f1703728" (UID: "1234755e-c8a2-4e9a-98ee-b700f1703728"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.601854 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e7817d43-2e6d-4f73-a592-22cd7fcf8787" (UID: "e7817d43-2e6d-4f73-a592-22cd7fcf8787"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.606955 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7817d43-2e6d-4f73-a592-22cd7fcf8787-kube-api-access-m8vgc" (OuterVolumeSpecName: "kube-api-access-m8vgc") pod "e7817d43-2e6d-4f73-a592-22cd7fcf8787" (UID: "e7817d43-2e6d-4f73-a592-22cd7fcf8787"). InnerVolumeSpecName "kube-api-access-m8vgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.643317 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1234755e-c8a2-4e9a-98ee-b700f1703728" (UID: "1234755e-c8a2-4e9a-98ee-b700f1703728"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.649240 4907 scope.go:117] "RemoveContainer" containerID="e6bb4364a6f79a33542c91557405459b01314dfa7b1d833c4034f58d11b7f29f" Nov 29 14:52:24 crc kubenswrapper[4907]: E1129 14:52:24.649520 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-668d798c8-5zhvp_openstack(3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa)\"" pod="openstack/heat-api-668d798c8-5zhvp" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.652843 4907 generic.go:334] "Generic (PLEG): container finished" podID="e7817d43-2e6d-4f73-a592-22cd7fcf8787" containerID="448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e" exitCode=0 Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.653053 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-568fdbf79-c6gqs" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.653081 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-568fdbf79-c6gqs" event={"ID":"e7817d43-2e6d-4f73-a592-22cd7fcf8787","Type":"ContainerDied","Data":"448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e"} Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.653111 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-568fdbf79-c6gqs" event={"ID":"e7817d43-2e6d-4f73-a592-22cd7fcf8787","Type":"ContainerDied","Data":"78d7135fa082f746944c869d0ff251b9f9b7ec5f1b261e4b532979ef7463cef8"} Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.653128 4907 scope.go:117] "RemoveContainer" containerID="448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.659821 4907 generic.go:334] "Generic (PLEG): container finished" podID="1234755e-c8a2-4e9a-98ee-b700f1703728" containerID="805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04" exitCode=0 Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.659909 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5c474cb7d8-pslcd" event={"ID":"1234755e-c8a2-4e9a-98ee-b700f1703728","Type":"ContainerDied","Data":"805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04"} Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.659938 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5c474cb7d8-pslcd" event={"ID":"1234755e-c8a2-4e9a-98ee-b700f1703728","Type":"ContainerDied","Data":"7831697a8fbe52285e60692ee0c741357ca52c40b0e2ecbd376b3831537ad177"} Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.661909 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5c474cb7d8-pslcd" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.669332 4907 scope.go:117] "RemoveContainer" containerID="28e8ec775ce7d721c8aeab01c185bf0037e993010406c9f04642355591ad8d7f" Nov 29 14:52:24 crc kubenswrapper[4907]: E1129 14:52:24.669600 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-c8fd9b956-4rqpf_openstack(1d704079-49d7-40e3-ba60-f8cf9c281e90)\"" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.680273 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" event={"ID":"74afd491-5339-485f-8a90-91f658a5f98e","Type":"ContainerDied","Data":"af9d3be9fb96a1ae53e10934ea743891fba47e7d3267d54159cd00b19065e73b"} Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.680783 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-2gnn4" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.686693 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7817d43-2e6d-4f73-a592-22cd7fcf8787" (UID: "e7817d43-2e6d-4f73-a592-22cd7fcf8787"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.693105 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerStarted","Data":"4339968d81a085c4fc53d392127e1c227dcfb7b05ab4de24fe534bdda1d302ce"} Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.696790 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data" (OuterVolumeSpecName: "config-data") pod "1234755e-c8a2-4e9a-98ee-b700f1703728" (UID: "1234755e-c8a2-4e9a-98ee-b700f1703728"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.696874 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data" (OuterVolumeSpecName: "config-data") pod "e7817d43-2e6d-4f73-a592-22cd7fcf8787" (UID: "e7817d43-2e6d-4f73-a592-22cd7fcf8787"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.697525 4907 scope.go:117] "RemoveContainer" containerID="448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e" Nov 29 14:52:24 crc kubenswrapper[4907]: E1129 14:52:24.697918 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e\": container with ID starting with 448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e not found: ID does not exist" containerID="448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.697949 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e"} err="failed to get container status \"448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e\": rpc error: code = NotFound desc = could not find container \"448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e\": container with ID starting with 448ab187901a1ac84f18411da92d5a5483cf15c9e2fa2ae73a49c193ddd4177e not found: ID does not exist" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.697969 4907 scope.go:117] "RemoveContainer" containerID="805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.704780 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.704807 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.704816 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8vgc\" (UniqueName: \"kubernetes.io/projected/e7817d43-2e6d-4f73-a592-22cd7fcf8787-kube-api-access-m8vgc\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.704826 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.704837 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snx2j\" (UniqueName: \"kubernetes.io/projected/1234755e-c8a2-4e9a-98ee-b700f1703728-kube-api-access-snx2j\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.704846 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.704856 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7817d43-2e6d-4f73-a592-22cd7fcf8787-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.704865 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1234755e-c8a2-4e9a-98ee-b700f1703728-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.716487 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2gnn4"] Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.726822 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-2gnn4"] Nov 29 14:52:24 crc 
kubenswrapper[4907]: I1129 14:52:24.734004 4907 scope.go:117] "RemoveContainer" containerID="805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04" Nov 29 14:52:24 crc kubenswrapper[4907]: E1129 14:52:24.734848 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04\": container with ID starting with 805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04 not found: ID does not exist" containerID="805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.734907 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04"} err="failed to get container status \"805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04\": rpc error: code = NotFound desc = could not find container \"805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04\": container with ID starting with 805aa47d0db7cfa30f6460c636ee07e577de27bc69e4471dda81a059beb8df04 not found: ID does not exist" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.734940 4907 scope.go:117] "RemoveContainer" containerID="3ed07bdb3db1e6d5e8c30418a10bb6df7ef82641a7b2bfb6911c24ef9bd299af" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.754619 4907 scope.go:117] "RemoveContainer" containerID="6b6ad7c067dc71c1271a4723cfb266c0268f720ee6a7f89bc1b51c4f986103d9" Nov 29 14:52:24 crc kubenswrapper[4907]: I1129 14:52:24.994625 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-568fdbf79-c6gqs"] Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.007706 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-568fdbf79-c6gqs"] Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.042550 4907 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/heat-api-5c474cb7d8-pslcd"] Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.071539 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5c474cb7d8-pslcd"] Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.376706 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dj5h6"] Nov 29 14:52:25 crc kubenswrapper[4907]: E1129 14:52:25.377450 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74afd491-5339-485f-8a90-91f658a5f98e" containerName="init" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.377466 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="74afd491-5339-485f-8a90-91f658a5f98e" containerName="init" Nov 29 14:52:25 crc kubenswrapper[4907]: E1129 14:52:25.377489 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1234755e-c8a2-4e9a-98ee-b700f1703728" containerName="heat-api" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.377496 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1234755e-c8a2-4e9a-98ee-b700f1703728" containerName="heat-api" Nov 29 14:52:25 crc kubenswrapper[4907]: E1129 14:52:25.377511 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7817d43-2e6d-4f73-a592-22cd7fcf8787" containerName="heat-cfnapi" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.377518 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7817d43-2e6d-4f73-a592-22cd7fcf8787" containerName="heat-cfnapi" Nov 29 14:52:25 crc kubenswrapper[4907]: E1129 14:52:25.377538 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74afd491-5339-485f-8a90-91f658a5f98e" containerName="dnsmasq-dns" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.377544 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="74afd491-5339-485f-8a90-91f658a5f98e" containerName="dnsmasq-dns" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.377754 4907 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1234755e-c8a2-4e9a-98ee-b700f1703728" containerName="heat-api" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.377789 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="74afd491-5339-485f-8a90-91f658a5f98e" containerName="dnsmasq-dns" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.377815 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7817d43-2e6d-4f73-a592-22cd7fcf8787" containerName="heat-cfnapi" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.378596 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.380804 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.381351 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lmwcv" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.381910 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.387237 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dj5h6"] Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.529816 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj7zc\" (UniqueName: \"kubernetes.io/projected/99873cd9-5727-4f0c-888d-ed6c6090abc1-kube-api-access-vj7zc\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.529881 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-config-data\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.529999 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-scripts\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.530042 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.631671 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj7zc\" (UniqueName: \"kubernetes.io/projected/99873cd9-5727-4f0c-888d-ed6c6090abc1-kube-api-access-vj7zc\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.631716 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-config-data\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.631871 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-scripts\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.632829 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.639492 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-config-data\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.639823 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-scripts\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.640172 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.650087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vj7zc\" (UniqueName: \"kubernetes.io/projected/99873cd9-5727-4f0c-888d-ed6c6090abc1-kube-api-access-vj7zc\") pod \"nova-cell0-conductor-db-sync-dj5h6\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.703699 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.723111 4907 scope.go:117] "RemoveContainer" containerID="28e8ec775ce7d721c8aeab01c185bf0037e993010406c9f04642355591ad8d7f" Nov 29 14:52:25 crc kubenswrapper[4907]: E1129 14:52:25.723633 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-c8fd9b956-4rqpf_openstack(1d704079-49d7-40e3-ba60-f8cf9c281e90)\"" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.723956 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerStarted","Data":"3162d14750066bfcc87d5e1d02b99ef99d2b29568bbb386520efdde907655e45"} Nov 29 14:52:25 crc kubenswrapper[4907]: I1129 14:52:25.724463 4907 scope.go:117] "RemoveContainer" containerID="e6bb4364a6f79a33542c91557405459b01314dfa7b1d833c4034f58d11b7f29f" Nov 29 14:52:25 crc kubenswrapper[4907]: E1129 14:52:25.724712 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-668d798c8-5zhvp_openstack(3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa)\"" pod="openstack/heat-api-668d798c8-5zhvp" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" Nov 29 14:52:26 crc kubenswrapper[4907]: 
I1129 14:52:26.446758 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dj5h6"] Nov 29 14:52:26 crc kubenswrapper[4907]: W1129 14:52:26.456779 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99873cd9_5727_4f0c_888d_ed6c6090abc1.slice/crio-286efb2c7c3b7d529d554995b2d73f48f1e13ca1e90f98dd25fb71eeec88419f WatchSource:0}: Error finding container 286efb2c7c3b7d529d554995b2d73f48f1e13ca1e90f98dd25fb71eeec88419f: Status 404 returned error can't find the container with id 286efb2c7c3b7d529d554995b2d73f48f1e13ca1e90f98dd25fb71eeec88419f Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.498101 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1234755e-c8a2-4e9a-98ee-b700f1703728" path="/var/lib/kubelet/pods/1234755e-c8a2-4e9a-98ee-b700f1703728/volumes" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.498841 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74afd491-5339-485f-8a90-91f658a5f98e" path="/var/lib/kubelet/pods/74afd491-5339-485f-8a90-91f658a5f98e/volumes" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.499416 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7817d43-2e6d-4f73-a592-22cd7fcf8787" path="/var/lib/kubelet/pods/e7817d43-2e6d-4f73-a592-22cd7fcf8787/volumes" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.579042 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5wt4"] Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.581430 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.599525 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5wt4"] Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.751255 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerStarted","Data":"daf6924a43b068a4d995e86eb12cc5b18490c89319507b46aa1823d782e267fa"} Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.753720 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dj5h6" event={"ID":"99873cd9-5727-4f0c-888d-ed6c6090abc1","Type":"ContainerStarted","Data":"286efb2c7c3b7d529d554995b2d73f48f1e13ca1e90f98dd25fb71eeec88419f"} Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.777477 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-catalog-content\") pod \"redhat-operators-w5wt4\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.777523 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-utilities\") pod \"redhat-operators-w5wt4\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.778333 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trw2p\" (UniqueName: \"kubernetes.io/projected/eed2c3ab-cd81-4930-853b-f7034be8e804-kube-api-access-trw2p\") pod \"redhat-operators-w5wt4\" 
(UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.880533 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-catalog-content\") pod \"redhat-operators-w5wt4\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.880618 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-utilities\") pod \"redhat-operators-w5wt4\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.880994 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-catalog-content\") pod \"redhat-operators-w5wt4\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.881050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-utilities\") pod \"redhat-operators-w5wt4\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.881180 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trw2p\" (UniqueName: \"kubernetes.io/projected/eed2c3ab-cd81-4930-853b-f7034be8e804-kube-api-access-trw2p\") pod \"redhat-operators-w5wt4\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " 
pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:26 crc kubenswrapper[4907]: I1129 14:52:26.913834 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trw2p\" (UniqueName: \"kubernetes.io/projected/eed2c3ab-cd81-4930-853b-f7034be8e804-kube-api-access-trw2p\") pod \"redhat-operators-w5wt4\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:27 crc kubenswrapper[4907]: I1129 14:52:27.207318 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:33 crc kubenswrapper[4907]: I1129 14:52:33.094858 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-644d94d9d7-tvfbj" Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.532357 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5wt4"] Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.566311 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.650020 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-668d798c8-5zhvp"] Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.852119 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"71aeb8b9-6bde-4a3e-a6f1-6d7c192490be","Type":"ContainerStarted","Data":"cfd34ed6fd286db1aef276b9791dc1a94032d9348b44d0ef5c8306ebfecc4652"} Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.864310 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.887056 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerStarted","Data":"80965ab6093d55cdf93ca55abf0b6fc88c9882a1933ba5b285640ac92f53747e"} Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.887998 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.893850 4907 generic.go:334] "Generic (PLEG): container finished" podID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerID="e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536" exitCode=0 Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.893889 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wt4" event={"ID":"eed2c3ab-cd81-4930-853b-f7034be8e804","Type":"ContainerDied","Data":"e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536"} Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.893909 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wt4" event={"ID":"eed2c3ab-cd81-4930-853b-f7034be8e804","Type":"ContainerStarted","Data":"e97c859a1732c989fb2dd1cd8464e21a870994c92c11ae17a32e9bdedf0a163f"} Nov 29 14:52:34 crc kubenswrapper[4907]: I1129 14:52:34.914288 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.967657749 podStartE2EDuration="22.914222696s" podCreationTimestamp="2025-11-29 14:52:12 +0000 UTC" firstStartedPulling="2025-11-29 14:52:14.051294971 +0000 UTC m=+1432.038132623" lastFinishedPulling="2025-11-29 14:52:33.997859918 +0000 UTC m=+1451.984697570" observedRunningTime="2025-11-29 14:52:34.870478869 +0000 UTC m=+1452.857316521" watchObservedRunningTime="2025-11-29 14:52:34.914222696 +0000 UTC m=+1452.901060368" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.077228 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.859235267 podStartE2EDuration="14.077197971s" podCreationTimestamp="2025-11-29 14:52:21 +0000 UTC" firstStartedPulling="2025-11-29 14:52:22.775791457 +0000 UTC m=+1440.762629109" lastFinishedPulling="2025-11-29 14:52:33.993754161 +0000 UTC m=+1451.980591813" observedRunningTime="2025-11-29 14:52:34.963610624 +0000 UTC m=+1452.950448266" watchObservedRunningTime="2025-11-29 14:52:35.077197971 +0000 UTC m=+1453.064035623" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.138006 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-c8fd9b956-4rqpf"] Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.223706 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.299302 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data\") pod \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.299421 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-combined-ca-bundle\") pod \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.299462 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttkl7\" (UniqueName: \"kubernetes.io/projected/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-kube-api-access-ttkl7\") pod \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.299581 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data-custom\") pod \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\" (UID: \"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa\") " Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.306274 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" (UID: "3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.309747 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-kube-api-access-ttkl7" (OuterVolumeSpecName: "kube-api-access-ttkl7") pod "3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" (UID: "3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa"). InnerVolumeSpecName "kube-api-access-ttkl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.357093 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" (UID: "3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.402715 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.402750 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttkl7\" (UniqueName: \"kubernetes.io/projected/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-kube-api-access-ttkl7\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.402761 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.412555 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data" (OuterVolumeSpecName: "config-data") pod "3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" (UID: "3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.504963 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.554945 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.709852 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data-custom\") pod \"1d704079-49d7-40e3-ba60-f8cf9c281e90\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.709985 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data\") pod \"1d704079-49d7-40e3-ba60-f8cf9c281e90\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.710077 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2qm4\" (UniqueName: \"kubernetes.io/projected/1d704079-49d7-40e3-ba60-f8cf9c281e90-kube-api-access-d2qm4\") pod \"1d704079-49d7-40e3-ba60-f8cf9c281e90\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.710092 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-combined-ca-bundle\") pod \"1d704079-49d7-40e3-ba60-f8cf9c281e90\" (UID: \"1d704079-49d7-40e3-ba60-f8cf9c281e90\") " Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.714797 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1d704079-49d7-40e3-ba60-f8cf9c281e90" (UID: "1d704079-49d7-40e3-ba60-f8cf9c281e90"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.716986 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d704079-49d7-40e3-ba60-f8cf9c281e90-kube-api-access-d2qm4" (OuterVolumeSpecName: "kube-api-access-d2qm4") pod "1d704079-49d7-40e3-ba60-f8cf9c281e90" (UID: "1d704079-49d7-40e3-ba60-f8cf9c281e90"). InnerVolumeSpecName "kube-api-access-d2qm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.761621 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d704079-49d7-40e3-ba60-f8cf9c281e90" (UID: "1d704079-49d7-40e3-ba60-f8cf9c281e90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.820107 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2qm4\" (UniqueName: \"kubernetes.io/projected/1d704079-49d7-40e3-ba60-f8cf9c281e90-kube-api-access-d2qm4\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.820144 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.820154 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.853706 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data" (OuterVolumeSpecName: "config-data") pod "1d704079-49d7-40e3-ba60-f8cf9c281e90" (UID: "1d704079-49d7-40e3-ba60-f8cf9c281e90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.921891 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d704079-49d7-40e3-ba60-f8cf9c281e90-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.923734 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" event={"ID":"1d704079-49d7-40e3-ba60-f8cf9c281e90","Type":"ContainerDied","Data":"44f6a7fe15e89be8e31d38b2fc68ed4f8fdb0e78be0ccc1223b5a03202af20f2"} Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.923796 4907 scope.go:117] "RemoveContainer" containerID="28e8ec775ce7d721c8aeab01c185bf0037e993010406c9f04642355591ad8d7f" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.923927 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-c8fd9b956-4rqpf" Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.951002 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-668d798c8-5zhvp" event={"ID":"3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa","Type":"ContainerDied","Data":"61b5c0e9cf1fa55dc5b5c3e2c66f304860bac83954212d9e323cf34ea8e77ad1"} Nov 29 14:52:35 crc kubenswrapper[4907]: I1129 14:52:35.951088 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-668d798c8-5zhvp" Nov 29 14:52:36 crc kubenswrapper[4907]: I1129 14:52:36.044066 4907 scope.go:117] "RemoveContainer" containerID="e6bb4364a6f79a33542c91557405459b01314dfa7b1d833c4034f58d11b7f29f" Nov 29 14:52:36 crc kubenswrapper[4907]: I1129 14:52:36.074502 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-c8fd9b956-4rqpf"] Nov 29 14:52:36 crc kubenswrapper[4907]: I1129 14:52:36.093487 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-c8fd9b956-4rqpf"] Nov 29 14:52:36 crc kubenswrapper[4907]: I1129 14:52:36.106788 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-668d798c8-5zhvp"] Nov 29 14:52:36 crc kubenswrapper[4907]: I1129 14:52:36.123274 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-668d798c8-5zhvp"] Nov 29 14:52:36 crc kubenswrapper[4907]: I1129 14:52:36.492123 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" path="/var/lib/kubelet/pods/1d704079-49d7-40e3-ba60-f8cf9c281e90/volumes" Nov 29 14:52:36 crc kubenswrapper[4907]: I1129 14:52:36.492673 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" path="/var/lib/kubelet/pods/3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa/volumes" Nov 29 14:52:36 crc kubenswrapper[4907]: I1129 14:52:36.976958 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wt4" event={"ID":"eed2c3ab-cd81-4930-853b-f7034be8e804","Type":"ContainerStarted","Data":"121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13"} Nov 29 14:52:37 crc kubenswrapper[4907]: I1129 14:52:37.493015 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:37 crc kubenswrapper[4907]: I1129 14:52:37.493187 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="ceilometer-central-agent" containerID="cri-o://4339968d81a085c4fc53d392127e1c227dcfb7b05ab4de24fe534bdda1d302ce" gracePeriod=30 Nov 29 14:52:37 crc kubenswrapper[4907]: I1129 14:52:37.493602 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="proxy-httpd" containerID="cri-o://80965ab6093d55cdf93ca55abf0b6fc88c9882a1933ba5b285640ac92f53747e" gracePeriod=30 Nov 29 14:52:37 crc kubenswrapper[4907]: I1129 14:52:37.493674 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="ceilometer-notification-agent" containerID="cri-o://3162d14750066bfcc87d5e1d02b99ef99d2b29568bbb386520efdde907655e45" gracePeriod=30 Nov 29 14:52:37 crc kubenswrapper[4907]: I1129 14:52:37.493791 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="sg-core" containerID="cri-o://daf6924a43b068a4d995e86eb12cc5b18490c89319507b46aa1823d782e267fa" gracePeriod=30 Nov 29 14:52:37 crc kubenswrapper[4907]: I1129 14:52:37.999115 4907 generic.go:334] "Generic (PLEG): container finished" podID="25be0daa-85b7-4739-a025-6d41b05442a0" containerID="80965ab6093d55cdf93ca55abf0b6fc88c9882a1933ba5b285640ac92f53747e" exitCode=0 Nov 29 14:52:37 crc kubenswrapper[4907]: I1129 14:52:37.999144 4907 generic.go:334] "Generic (PLEG): container finished" podID="25be0daa-85b7-4739-a025-6d41b05442a0" containerID="daf6924a43b068a4d995e86eb12cc5b18490c89319507b46aa1823d782e267fa" exitCode=2 Nov 29 14:52:37 crc kubenswrapper[4907]: I1129 14:52:37.999854 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerDied","Data":"80965ab6093d55cdf93ca55abf0b6fc88c9882a1933ba5b285640ac92f53747e"} Nov 29 14:52:37 crc kubenswrapper[4907]: I1129 14:52:37.999976 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerDied","Data":"daf6924a43b068a4d995e86eb12cc5b18490c89319507b46aa1823d782e267fa"} Nov 29 14:52:39 crc kubenswrapper[4907]: I1129 14:52:39.018895 4907 generic.go:334] "Generic (PLEG): container finished" podID="25be0daa-85b7-4739-a025-6d41b05442a0" containerID="3162d14750066bfcc87d5e1d02b99ef99d2b29568bbb386520efdde907655e45" exitCode=0 Nov 29 14:52:39 crc kubenswrapper[4907]: I1129 14:52:39.019604 4907 generic.go:334] "Generic (PLEG): container finished" podID="25be0daa-85b7-4739-a025-6d41b05442a0" containerID="4339968d81a085c4fc53d392127e1c227dcfb7b05ab4de24fe534bdda1d302ce" exitCode=0 Nov 29 14:52:39 crc kubenswrapper[4907]: I1129 14:52:39.018990 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerDied","Data":"3162d14750066bfcc87d5e1d02b99ef99d2b29568bbb386520efdde907655e45"} Nov 29 14:52:39 crc kubenswrapper[4907]: I1129 14:52:39.019654 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerDied","Data":"4339968d81a085c4fc53d392127e1c227dcfb7b05ab4de24fe534bdda1d302ce"} Nov 29 14:52:39 crc kubenswrapper[4907]: I1129 14:52:39.382260 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:52:39 crc kubenswrapper[4907]: I1129 14:52:39.446978 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-644d94d9d7-tvfbj"] Nov 29 14:52:39 crc kubenswrapper[4907]: I1129 14:52:39.449472 4907 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/heat-engine-644d94d9d7-tvfbj" podUID="df2182cf-1b22-424b-8c39-e44567b07d45" containerName="heat-engine" containerID="cri-o://66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f" gracePeriod=60 Nov 29 14:52:40 crc kubenswrapper[4907]: I1129 14:52:40.042072 4907 generic.go:334] "Generic (PLEG): container finished" podID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerID="121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13" exitCode=0 Nov 29 14:52:40 crc kubenswrapper[4907]: I1129 14:52:40.042144 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wt4" event={"ID":"eed2c3ab-cd81-4930-853b-f7034be8e804","Type":"ContainerDied","Data":"121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13"} Nov 29 14:52:43 crc kubenswrapper[4907]: E1129 14:52:43.064359 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Nov 29 14:52:43 crc kubenswrapper[4907]: E1129 14:52:43.065834 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Nov 29 14:52:43 crc kubenswrapper[4907]: E1129 14:52:43.067071 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Nov 29 
14:52:43 crc kubenswrapper[4907]: E1129 14:52:43.067145 4907 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-644d94d9d7-tvfbj" podUID="df2182cf-1b22-424b-8c39-e44567b07d45" containerName="heat-engine" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.027229 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.130542 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25be0daa-85b7-4739-a025-6d41b05442a0","Type":"ContainerDied","Data":"34a69ea94c2b125891a6ecede66920bfdbfe72f4b46803b2779adf93595d7a69"} Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.130594 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.130603 4907 scope.go:117] "RemoveContainer" containerID="80965ab6093d55cdf93ca55abf0b6fc88c9882a1933ba5b285640ac92f53747e" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.183647 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-sg-core-conf-yaml\") pod \"25be0daa-85b7-4739-a025-6d41b05442a0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.183718 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-config-data\") pod \"25be0daa-85b7-4739-a025-6d41b05442a0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.183791 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-run-httpd\") pod \"25be0daa-85b7-4739-a025-6d41b05442a0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.183864 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v9nz\" (UniqueName: \"kubernetes.io/projected/25be0daa-85b7-4739-a025-6d41b05442a0-kube-api-access-7v9nz\") pod \"25be0daa-85b7-4739-a025-6d41b05442a0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.183897 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-scripts\") pod \"25be0daa-85b7-4739-a025-6d41b05442a0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.183938 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-log-httpd\") pod \"25be0daa-85b7-4739-a025-6d41b05442a0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.183952 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-combined-ca-bundle\") pod \"25be0daa-85b7-4739-a025-6d41b05442a0\" (UID: \"25be0daa-85b7-4739-a025-6d41b05442a0\") " Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.184185 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "25be0daa-85b7-4739-a025-6d41b05442a0" (UID: "25be0daa-85b7-4739-a025-6d41b05442a0"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.184599 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "25be0daa-85b7-4739-a025-6d41b05442a0" (UID: "25be0daa-85b7-4739-a025-6d41b05442a0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.185060 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.185086 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25be0daa-85b7-4739-a025-6d41b05442a0-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.866176 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-scripts" (OuterVolumeSpecName: "scripts") pod "25be0daa-85b7-4739-a025-6d41b05442a0" (UID: "25be0daa-85b7-4739-a025-6d41b05442a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.867705 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25be0daa-85b7-4739-a025-6d41b05442a0-kube-api-access-7v9nz" (OuterVolumeSpecName: "kube-api-access-7v9nz") pod "25be0daa-85b7-4739-a025-6d41b05442a0" (UID: "25be0daa-85b7-4739-a025-6d41b05442a0"). InnerVolumeSpecName "kube-api-access-7v9nz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.896086 4907 scope.go:117] "RemoveContainer" containerID="daf6924a43b068a4d995e86eb12cc5b18490c89319507b46aa1823d782e267fa" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.901713 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v9nz\" (UniqueName: \"kubernetes.io/projected/25be0daa-85b7-4739-a025-6d41b05442a0-kube-api-access-7v9nz\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.901746 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:46 crc kubenswrapper[4907]: I1129 14:52:46.928072 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "25be0daa-85b7-4739-a025-6d41b05442a0" (UID: "25be0daa-85b7-4739-a025-6d41b05442a0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.006070 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.073183 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25be0daa-85b7-4739-a025-6d41b05442a0" (UID: "25be0daa-85b7-4739-a025-6d41b05442a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.108423 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.108668 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-config-data" (OuterVolumeSpecName: "config-data") pod "25be0daa-85b7-4739-a025-6d41b05442a0" (UID: "25be0daa-85b7-4739-a025-6d41b05442a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.115550 4907 scope.go:117] "RemoveContainer" containerID="3162d14750066bfcc87d5e1d02b99ef99d2b29568bbb386520efdde907655e45" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.154303 4907 scope.go:117] "RemoveContainer" containerID="4339968d81a085c4fc53d392127e1c227dcfb7b05ab4de24fe534bdda1d302ce" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.176731 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wt4" event={"ID":"eed2c3ab-cd81-4930-853b-f7034be8e804","Type":"ContainerStarted","Data":"3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944"} Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.179112 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dj5h6" event={"ID":"99873cd9-5727-4f0c-888d-ed6c6090abc1","Type":"ContainerStarted","Data":"400e32d48bbb34dc2b5c4d40c4e020a9a4145508b813d144fa5a8ba986bbfdcb"} Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.209675 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 
14:52:47.209914 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.210005 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25be0daa-85b7-4739-a025-6d41b05442a0-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.213123 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5wt4" podStartSLOduration=10.297538642 podStartE2EDuration="21.213100266s" podCreationTimestamp="2025-11-29 14:52:26 +0000 UTC" firstStartedPulling="2025-11-29 14:52:34.921163684 +0000 UTC m=+1452.908001336" lastFinishedPulling="2025-11-29 14:52:45.836725308 +0000 UTC m=+1463.823562960" observedRunningTime="2025-11-29 14:52:47.199951251 +0000 UTC m=+1465.186788903" watchObservedRunningTime="2025-11-29 14:52:47.213100266 +0000 UTC m=+1465.199937918" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.229198 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-dj5h6" podStartSLOduration=2.8512701529999998 podStartE2EDuration="22.229183074s" podCreationTimestamp="2025-11-29 14:52:25 +0000 UTC" firstStartedPulling="2025-11-29 14:52:26.463593854 +0000 UTC m=+1444.450431506" lastFinishedPulling="2025-11-29 14:52:45.841506775 +0000 UTC m=+1463.828344427" observedRunningTime="2025-11-29 14:52:47.223755009 +0000 UTC m=+1465.210592661" watchObservedRunningTime="2025-11-29 14:52:47.229183074 +0000 UTC m=+1465.216020726" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.366146 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.380565 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:47 crc 
kubenswrapper[4907]: I1129 14:52:47.507356 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:47 crc kubenswrapper[4907]: E1129 14:52:47.507853 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="ceilometer-central-agent" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.507870 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="ceilometer-central-agent" Nov 29 14:52:47 crc kubenswrapper[4907]: E1129 14:52:47.507895 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" containerName="heat-cfnapi" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.507901 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" containerName="heat-cfnapi" Nov 29 14:52:47 crc kubenswrapper[4907]: E1129 14:52:47.507908 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" containerName="heat-api" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.507914 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" containerName="heat-api" Nov 29 14:52:47 crc kubenswrapper[4907]: E1129 14:52:47.507941 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="proxy-httpd" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.507946 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="proxy-httpd" Nov 29 14:52:47 crc kubenswrapper[4907]: E1129 14:52:47.507961 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="ceilometer-notification-agent" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.507968 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="ceilometer-notification-agent" Nov 29 14:52:47 crc kubenswrapper[4907]: E1129 14:52:47.507979 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="sg-core" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.507985 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="sg-core" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508185 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" containerName="heat-cfnapi" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508197 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" containerName="heat-api" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508207 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="ceilometer-notification-agent" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508220 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="sg-core" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508235 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" containerName="heat-cfnapi" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508244 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="ceilometer-central-agent" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508256 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" containerName="proxy-httpd" Nov 29 14:52:47 crc kubenswrapper[4907]: E1129 14:52:47.508478 
4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" containerName="heat-cfnapi" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508490 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d704079-49d7-40e3-ba60-f8cf9c281e90" containerName="heat-cfnapi" Nov 29 14:52:47 crc kubenswrapper[4907]: E1129 14:52:47.508500 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" containerName="heat-api" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508508 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" containerName="heat-api" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.508694 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c8f5fc4-e0de-43ce-bc77-5caac2dfcfaa" containerName="heat-api" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.510249 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.516180 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.516263 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.537273 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.619470 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-log-httpd\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.619524 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.619584 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.619680 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtbrr\" (UniqueName: \"kubernetes.io/projected/b3acb4f3-e011-4052-9169-15e103e3d081-kube-api-access-qtbrr\") pod \"ceilometer-0\" (UID: 
\"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.620128 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-run-httpd\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.620174 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-scripts\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.620340 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-config-data\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.722295 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-config-data\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.722353 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-log-httpd\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.722379 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.722447 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.722484 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtbrr\" (UniqueName: \"kubernetes.io/projected/b3acb4f3-e011-4052-9169-15e103e3d081-kube-api-access-qtbrr\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.722592 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-run-httpd\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.722615 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-scripts\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.722928 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-log-httpd\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 
crc kubenswrapper[4907]: I1129 14:52:47.723286 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-run-httpd\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.726977 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.727349 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.727334 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-config-data\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.727896 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-scripts\") pod \"ceilometer-0\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.742042 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtbrr\" (UniqueName: \"kubernetes.io/projected/b3acb4f3-e011-4052-9169-15e103e3d081-kube-api-access-qtbrr\") pod \"ceilometer-0\" (UID: 
\"b3acb4f3-e011-4052-9169-15e103e3d081\") " pod="openstack/ceilometer-0" Nov 29 14:52:47 crc kubenswrapper[4907]: I1129 14:52:47.831963 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.304853 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5wt4" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="registry-server" probeResult="failure" output=< Nov 29 14:52:48 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 14:52:48 crc kubenswrapper[4907]: > Nov 29 14:52:48 crc kubenswrapper[4907]: E1129 14:52:48.398782 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf2182cf_1b22_424b_8c39_e44567b07d45.slice/crio-conmon-66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf2182cf_1b22_424b_8c39_e44567b07d45.slice/crio-66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f.scope\": RecentStats: unable to find data in memory cache]" Nov 29 14:52:48 crc kubenswrapper[4907]: E1129 14:52:48.406016 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf2182cf_1b22_424b_8c39_e44567b07d45.slice/crio-conmon-66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f.scope\": RecentStats: unable to find data in memory cache]" Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.506905 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25be0daa-85b7-4739-a025-6d41b05442a0" path="/var/lib/kubelet/pods/25be0daa-85b7-4739-a025-6d41b05442a0/volumes" Nov 29 
14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.508081 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.804704 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-644d94d9d7-tvfbj" Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.857110 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djvc6\" (UniqueName: \"kubernetes.io/projected/df2182cf-1b22-424b-8c39-e44567b07d45-kube-api-access-djvc6\") pod \"df2182cf-1b22-424b-8c39-e44567b07d45\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.858103 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data\") pod \"df2182cf-1b22-424b-8c39-e44567b07d45\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.858196 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data-custom\") pod \"df2182cf-1b22-424b-8c39-e44567b07d45\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.858617 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-combined-ca-bundle\") pod \"df2182cf-1b22-424b-8c39-e44567b07d45\" (UID: \"df2182cf-1b22-424b-8c39-e44567b07d45\") " Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.864926 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data-custom" 
(OuterVolumeSpecName: "config-data-custom") pod "df2182cf-1b22-424b-8c39-e44567b07d45" (UID: "df2182cf-1b22-424b-8c39-e44567b07d45"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.867576 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df2182cf-1b22-424b-8c39-e44567b07d45-kube-api-access-djvc6" (OuterVolumeSpecName: "kube-api-access-djvc6") pod "df2182cf-1b22-424b-8c39-e44567b07d45" (UID: "df2182cf-1b22-424b-8c39-e44567b07d45"). InnerVolumeSpecName "kube-api-access-djvc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.918238 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df2182cf-1b22-424b-8c39-e44567b07d45" (UID: "df2182cf-1b22-424b-8c39-e44567b07d45"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.961618 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.961649 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.961660 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djvc6\" (UniqueName: \"kubernetes.io/projected/df2182cf-1b22-424b-8c39-e44567b07d45-kube-api-access-djvc6\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:48 crc kubenswrapper[4907]: I1129 14:52:48.969255 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data" (OuterVolumeSpecName: "config-data") pod "df2182cf-1b22-424b-8c39-e44567b07d45" (UID: "df2182cf-1b22-424b-8c39-e44567b07d45"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.063432 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2182cf-1b22-424b-8c39-e44567b07d45-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.256696 4907 generic.go:334] "Generic (PLEG): container finished" podID="df2182cf-1b22-424b-8c39-e44567b07d45" containerID="66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f" exitCode=0 Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.256745 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-644d94d9d7-tvfbj" event={"ID":"df2182cf-1b22-424b-8c39-e44567b07d45","Type":"ContainerDied","Data":"66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f"} Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.256770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-644d94d9d7-tvfbj" event={"ID":"df2182cf-1b22-424b-8c39-e44567b07d45","Type":"ContainerDied","Data":"84bcfbd040f2c72fc6d1dad34dc1962763f1e56599004736a2489df9b54521ef"} Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.256787 4907 scope.go:117] "RemoveContainer" containerID="66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f" Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.256886 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-644d94d9d7-tvfbj" Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.258753 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerStarted","Data":"205cb843125d147f32840cd716c9fef6cbbe1b1b45bb2f52f2cc4a1eb3e5d274"} Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.299933 4907 scope.go:117] "RemoveContainer" containerID="66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f" Nov 29 14:52:49 crc kubenswrapper[4907]: E1129 14:52:49.303516 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f\": container with ID starting with 66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f not found: ID does not exist" containerID="66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f" Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.303626 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f"} err="failed to get container status \"66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f\": rpc error: code = NotFound desc = could not find container \"66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f\": container with ID starting with 66bf86b1e8af30ab59344c6341821fe3b6f09a11edbc8e6b6227b81e409d347f not found: ID does not exist" Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.344507 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-644d94d9d7-tvfbj"] Nov 29 14:52:49 crc kubenswrapper[4907]: I1129 14:52:49.361300 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-644d94d9d7-tvfbj"] Nov 29 14:52:50 crc kubenswrapper[4907]: I1129 14:52:50.284350 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerStarted","Data":"1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a"} Nov 29 14:52:50 crc kubenswrapper[4907]: I1129 14:52:50.284800 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerStarted","Data":"cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9"} Nov 29 14:52:50 crc kubenswrapper[4907]: I1129 14:52:50.495267 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df2182cf-1b22-424b-8c39-e44567b07d45" path="/var/lib/kubelet/pods/df2182cf-1b22-424b-8c39-e44567b07d45/volumes" Nov 29 14:52:51 crc kubenswrapper[4907]: I1129 14:52:51.298423 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerStarted","Data":"f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1"} Nov 29 14:52:53 crc kubenswrapper[4907]: I1129 14:52:53.334056 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerStarted","Data":"da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9"} Nov 29 14:52:53 crc kubenswrapper[4907]: I1129 14:52:53.335843 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 14:52:53 crc kubenswrapper[4907]: I1129 14:52:53.366563 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8817184449999997 podStartE2EDuration="6.366548826s" podCreationTimestamp="2025-11-29 14:52:47 +0000 UTC" firstStartedPulling="2025-11-29 14:52:48.502916357 +0000 UTC m=+1466.489754009" lastFinishedPulling="2025-11-29 14:52:51.987746738 +0000 UTC m=+1469.974584390" 
observedRunningTime="2025-11-29 14:52:53.366088183 +0000 UTC m=+1471.352925835" watchObservedRunningTime="2025-11-29 14:52:53.366548826 +0000 UTC m=+1471.353386478" Nov 29 14:52:54 crc kubenswrapper[4907]: I1129 14:52:54.651223 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:52:56 crc kubenswrapper[4907]: I1129 14:52:56.380705 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="ceilometer-central-agent" containerID="cri-o://cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9" gracePeriod=30 Nov 29 14:52:56 crc kubenswrapper[4907]: I1129 14:52:56.380788 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="sg-core" containerID="cri-o://f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1" gracePeriod=30 Nov 29 14:52:56 crc kubenswrapper[4907]: I1129 14:52:56.380802 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="ceilometer-notification-agent" containerID="cri-o://1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a" gracePeriod=30 Nov 29 14:52:56 crc kubenswrapper[4907]: I1129 14:52:56.380833 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="proxy-httpd" containerID="cri-o://da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9" gracePeriod=30 Nov 29 14:52:57 crc kubenswrapper[4907]: I1129 14:52:57.393806 4907 generic.go:334] "Generic (PLEG): container finished" podID="b3acb4f3-e011-4052-9169-15e103e3d081" containerID="da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9" exitCode=0 Nov 29 14:52:57 crc 
kubenswrapper[4907]: I1129 14:52:57.394098 4907 generic.go:334] "Generic (PLEG): container finished" podID="b3acb4f3-e011-4052-9169-15e103e3d081" containerID="f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1" exitCode=2 Nov 29 14:52:57 crc kubenswrapper[4907]: I1129 14:52:57.394108 4907 generic.go:334] "Generic (PLEG): container finished" podID="b3acb4f3-e011-4052-9169-15e103e3d081" containerID="1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a" exitCode=0 Nov 29 14:52:57 crc kubenswrapper[4907]: I1129 14:52:57.393866 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerDied","Data":"da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9"} Nov 29 14:52:57 crc kubenswrapper[4907]: I1129 14:52:57.394143 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerDied","Data":"f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1"} Nov 29 14:52:57 crc kubenswrapper[4907]: I1129 14:52:57.394158 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerDied","Data":"1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a"} Nov 29 14:52:58 crc kubenswrapper[4907]: I1129 14:52:58.272030 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5wt4" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="registry-server" probeResult="failure" output=< Nov 29 14:52:58 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 14:52:58 crc kubenswrapper[4907]: > Nov 29 14:53:00 crc kubenswrapper[4907]: I1129 14:53:00.881602 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:53:00 crc 
kubenswrapper[4907]: I1129 14:53:00.882734 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerName="glance-httpd" containerID="cri-o://055477b9a1e16f2b062a2b014f3463bf1912746d19f695a05a3cdf41f9d3f1bf" gracePeriod=30 Nov 29 14:53:00 crc kubenswrapper[4907]: I1129 14:53:00.885772 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerName="glance-log" containerID="cri-o://a06af6f54d1a144582ea59ec2783f88120c05fdbd4d7f559c2afea2ad660f6db" gracePeriod=30 Nov 29 14:53:01 crc kubenswrapper[4907]: I1129 14:53:01.439000 4907 generic.go:334] "Generic (PLEG): container finished" podID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerID="a06af6f54d1a144582ea59ec2783f88120c05fdbd4d7f559c2afea2ad660f6db" exitCode=143 Nov 29 14:53:01 crc kubenswrapper[4907]: I1129 14:53:01.439087 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b638c730-93e5-475d-afd6-b1c83c3e4952","Type":"ContainerDied","Data":"a06af6f54d1a144582ea59ec2783f88120c05fdbd4d7f559c2afea2ad660f6db"} Nov 29 14:53:01 crc kubenswrapper[4907]: I1129 14:53:01.442413 4907 generic.go:334] "Generic (PLEG): container finished" podID="99873cd9-5727-4f0c-888d-ed6c6090abc1" containerID="400e32d48bbb34dc2b5c4d40c4e020a9a4145508b813d144fa5a8ba986bbfdcb" exitCode=0 Nov 29 14:53:01 crc kubenswrapper[4907]: I1129 14:53:01.442493 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dj5h6" event={"ID":"99873cd9-5727-4f0c-888d-ed6c6090abc1","Type":"ContainerDied","Data":"400e32d48bbb34dc2b5c4d40c4e020a9a4145508b813d144fa5a8ba986bbfdcb"} Nov 29 14:53:01 crc kubenswrapper[4907]: I1129 14:53:01.988409 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Nov 29 14:53:01 crc kubenswrapper[4907]: I1129 14:53:01.989028 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-log" containerID="cri-o://fae5b500fdd97fec81f970e4aa5152fe8be097c18c5be82a016be44e262b1425" gracePeriod=30 Nov 29 14:53:01 crc kubenswrapper[4907]: I1129 14:53:01.989192 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-httpd" containerID="cri-o://2df3e3d52990970ce9e869c42b9bcf5b24cd86a53fa741f9a60d8153a701cbd4" gracePeriod=30 Nov 29 14:53:02 crc kubenswrapper[4907]: I1129 14:53:02.510995 4907 generic.go:334] "Generic (PLEG): container finished" podID="7b49cb25-4ca7-4382-b378-749bf7081894" containerID="fae5b500fdd97fec81f970e4aa5152fe8be097c18c5be82a016be44e262b1425" exitCode=143 Nov 29 14:53:02 crc kubenswrapper[4907]: I1129 14:53:02.567355 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b49cb25-4ca7-4382-b378-749bf7081894","Type":"ContainerDied","Data":"fae5b500fdd97fec81f970e4aa5152fe8be097c18c5be82a016be44e262b1425"} Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.007169 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.194006 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-config-data\") pod \"99873cd9-5727-4f0c-888d-ed6c6090abc1\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.194083 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-scripts\") pod \"99873cd9-5727-4f0c-888d-ed6c6090abc1\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.194146 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-combined-ca-bundle\") pod \"99873cd9-5727-4f0c-888d-ed6c6090abc1\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.194269 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj7zc\" (UniqueName: \"kubernetes.io/projected/99873cd9-5727-4f0c-888d-ed6c6090abc1-kube-api-access-vj7zc\") pod \"99873cd9-5727-4f0c-888d-ed6c6090abc1\" (UID: \"99873cd9-5727-4f0c-888d-ed6c6090abc1\") " Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.201102 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-scripts" (OuterVolumeSpecName: "scripts") pod "99873cd9-5727-4f0c-888d-ed6c6090abc1" (UID: "99873cd9-5727-4f0c-888d-ed6c6090abc1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.201678 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99873cd9-5727-4f0c-888d-ed6c6090abc1-kube-api-access-vj7zc" (OuterVolumeSpecName: "kube-api-access-vj7zc") pod "99873cd9-5727-4f0c-888d-ed6c6090abc1" (UID: "99873cd9-5727-4f0c-888d-ed6c6090abc1"). InnerVolumeSpecName "kube-api-access-vj7zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.230593 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99873cd9-5727-4f0c-888d-ed6c6090abc1" (UID: "99873cd9-5727-4f0c-888d-ed6c6090abc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.278169 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-config-data" (OuterVolumeSpecName: "config-data") pod "99873cd9-5727-4f0c-888d-ed6c6090abc1" (UID: "99873cd9-5727-4f0c-888d-ed6c6090abc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.300771 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.300841 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.300858 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj7zc\" (UniqueName: \"kubernetes.io/projected/99873cd9-5727-4f0c-888d-ed6c6090abc1-kube-api-access-vj7zc\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.300868 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99873cd9-5727-4f0c-888d-ed6c6090abc1-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.525571 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dj5h6" event={"ID":"99873cd9-5727-4f0c-888d-ed6c6090abc1","Type":"ContainerDied","Data":"286efb2c7c3b7d529d554995b2d73f48f1e13ca1e90f98dd25fb71eeec88419f"} Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.525614 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="286efb2c7c3b7d529d554995b2d73f48f1e13ca1e90f98dd25fb71eeec88419f" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.525698 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dj5h6" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.571472 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 14:53:03 crc kubenswrapper[4907]: E1129 14:53:03.571977 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2182cf-1b22-424b-8c39-e44567b07d45" containerName="heat-engine" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.571999 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2182cf-1b22-424b-8c39-e44567b07d45" containerName="heat-engine" Nov 29 14:53:03 crc kubenswrapper[4907]: E1129 14:53:03.572043 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99873cd9-5727-4f0c-888d-ed6c6090abc1" containerName="nova-cell0-conductor-db-sync" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.572052 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="99873cd9-5727-4f0c-888d-ed6c6090abc1" containerName="nova-cell0-conductor-db-sync" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.572342 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="99873cd9-5727-4f0c-888d-ed6c6090abc1" containerName="nova-cell0-conductor-db-sync" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.572378 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="df2182cf-1b22-424b-8c39-e44567b07d45" containerName="heat-engine" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.574612 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.578716 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lmwcv" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.578992 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.583598 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.610868 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21e4e8e-a729-4641-9c99-c022eb3ca6a8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a21e4e8e-a729-4641-9c99-c022eb3ca6a8\") " pod="openstack/nova-cell0-conductor-0" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.611324 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21e4e8e-a729-4641-9c99-c022eb3ca6a8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a21e4e8e-a729-4641-9c99-c022eb3ca6a8\") " pod="openstack/nova-cell0-conductor-0" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.611528 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw4lk\" (UniqueName: \"kubernetes.io/projected/a21e4e8e-a729-4641-9c99-c022eb3ca6a8-kube-api-access-gw4lk\") pod \"nova-cell0-conductor-0\" (UID: \"a21e4e8e-a729-4641-9c99-c022eb3ca6a8\") " pod="openstack/nova-cell0-conductor-0" Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.714455 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a21e4e8e-a729-4641-9c99-c022eb3ca6a8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a21e4e8e-a729-4641-9c99-c022eb3ca6a8\") " pod="openstack/nova-cell0-conductor-0"
Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.714615 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw4lk\" (UniqueName: \"kubernetes.io/projected/a21e4e8e-a729-4641-9c99-c022eb3ca6a8-kube-api-access-gw4lk\") pod \"nova-cell0-conductor-0\" (UID: \"a21e4e8e-a729-4641-9c99-c022eb3ca6a8\") " pod="openstack/nova-cell0-conductor-0"
Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.714791 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21e4e8e-a729-4641-9c99-c022eb3ca6a8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a21e4e8e-a729-4641-9c99-c022eb3ca6a8\") " pod="openstack/nova-cell0-conductor-0"
Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.719581 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21e4e8e-a729-4641-9c99-c022eb3ca6a8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a21e4e8e-a729-4641-9c99-c022eb3ca6a8\") " pod="openstack/nova-cell0-conductor-0"
Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.720029 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21e4e8e-a729-4641-9c99-c022eb3ca6a8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a21e4e8e-a729-4641-9c99-c022eb3ca6a8\") " pod="openstack/nova-cell0-conductor-0"
Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.730177 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw4lk\" (UniqueName: \"kubernetes.io/projected/a21e4e8e-a729-4641-9c99-c022eb3ca6a8-kube-api-access-gw4lk\") pod \"nova-cell0-conductor-0\" (UID: \"a21e4e8e-a729-4641-9c99-c022eb3ca6a8\") " pod="openstack/nova-cell0-conductor-0"
Nov 29 14:53:03 crc kubenswrapper[4907]: I1129 14:53:03.902995 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.403246 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.544329 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a21e4e8e-a729-4641-9c99-c022eb3ca6a8","Type":"ContainerStarted","Data":"a624d463ca2a63ec77b7ef2bc433cb04b8d663beadaf867e8c5fbd76b7a975dd"}
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.547153 4907 generic.go:334] "Generic (PLEG): container finished" podID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerID="055477b9a1e16f2b062a2b014f3463bf1912746d19f695a05a3cdf41f9d3f1bf" exitCode=0
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.547181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b638c730-93e5-475d-afd6-b1c83c3e4952","Type":"ContainerDied","Data":"055477b9a1e16f2b062a2b014f3463bf1912746d19f695a05a3cdf41f9d3f1bf"}
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.642655 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.842785 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b638c730-93e5-475d-afd6-b1c83c3e4952\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") "
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.842847 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-scripts\") pod \"b638c730-93e5-475d-afd6-b1c83c3e4952\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") "
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.842907 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-config-data\") pod \"b638c730-93e5-475d-afd6-b1c83c3e4952\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") "
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.842933 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-logs\") pod \"b638c730-93e5-475d-afd6-b1c83c3e4952\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") "
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.842970 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-public-tls-certs\") pod \"b638c730-93e5-475d-afd6-b1c83c3e4952\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") "
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.843061 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-httpd-run\") pod \"b638c730-93e5-475d-afd6-b1c83c3e4952\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") "
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.843093 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49lt\" (UniqueName: \"kubernetes.io/projected/b638c730-93e5-475d-afd6-b1c83c3e4952-kube-api-access-b49lt\") pod \"b638c730-93e5-475d-afd6-b1c83c3e4952\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") "
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.843163 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-combined-ca-bundle\") pod \"b638c730-93e5-475d-afd6-b1c83c3e4952\" (UID: \"b638c730-93e5-475d-afd6-b1c83c3e4952\") "
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.843371 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-logs" (OuterVolumeSpecName: "logs") pod "b638c730-93e5-475d-afd6-b1c83c3e4952" (UID: "b638c730-93e5-475d-afd6-b1c83c3e4952"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.843492 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b638c730-93e5-475d-afd6-b1c83c3e4952" (UID: "b638c730-93e5-475d-afd6-b1c83c3e4952"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.843926 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-logs\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.843942 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b638c730-93e5-475d-afd6-b1c83c3e4952-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.847094 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "b638c730-93e5-475d-afd6-b1c83c3e4952" (UID: "b638c730-93e5-475d-afd6-b1c83c3e4952"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.851683 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-scripts" (OuterVolumeSpecName: "scripts") pod "b638c730-93e5-475d-afd6-b1c83c3e4952" (UID: "b638c730-93e5-475d-afd6-b1c83c3e4952"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.853556 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b638c730-93e5-475d-afd6-b1c83c3e4952-kube-api-access-b49lt" (OuterVolumeSpecName: "kube-api-access-b49lt") pod "b638c730-93e5-475d-afd6-b1c83c3e4952" (UID: "b638c730-93e5-475d-afd6-b1c83c3e4952"). InnerVolumeSpecName "kube-api-access-b49lt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.898489 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b638c730-93e5-475d-afd6-b1c83c3e4952" (UID: "b638c730-93e5-475d-afd6-b1c83c3e4952"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.907752 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-config-data" (OuterVolumeSpecName: "config-data") pod "b638c730-93e5-475d-afd6-b1c83c3e4952" (UID: "b638c730-93e5-475d-afd6-b1c83c3e4952"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.927405 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b638c730-93e5-475d-afd6-b1c83c3e4952" (UID: "b638c730-93e5-475d-afd6-b1c83c3e4952"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.946397 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.946466 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.946482 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.946494 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.946506 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49lt\" (UniqueName: \"kubernetes.io/projected/b638c730-93e5-475d-afd6-b1c83c3e4952-kube-api-access-b49lt\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.946517 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b638c730-93e5-475d-afd6-b1c83c3e4952-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:04 crc kubenswrapper[4907]: I1129 14:53:04.975119 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.049405 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.382885 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.190:9292/healthcheck\": dial tcp 10.217.0.190:9292: connect: connection refused"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.382982 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.190:9292/healthcheck\": dial tcp 10.217.0.190:9292: connect: connection refused"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.560110 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b638c730-93e5-475d-afd6-b1c83c3e4952","Type":"ContainerDied","Data":"38cb7a61de30e6dfa404f4c379ff5f86300d90e1f3627139dc23b53d425918cc"}
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.560163 4907 scope.go:117] "RemoveContainer" containerID="055477b9a1e16f2b062a2b014f3463bf1912746d19f695a05a3cdf41f9d3f1bf"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.560134 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.563107 4907 generic.go:334] "Generic (PLEG): container finished" podID="7b49cb25-4ca7-4382-b378-749bf7081894" containerID="2df3e3d52990970ce9e869c42b9bcf5b24cd86a53fa741f9a60d8153a701cbd4" exitCode=0
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.563153 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b49cb25-4ca7-4382-b378-749bf7081894","Type":"ContainerDied","Data":"2df3e3d52990970ce9e869c42b9bcf5b24cd86a53fa741f9a60d8153a701cbd4"}
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.565776 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a21e4e8e-a729-4641-9c99-c022eb3ca6a8","Type":"ContainerStarted","Data":"4743008f5ba9c5060f74c7021d38eadc89cf439d30a20430d401f16caccceb47"}
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.566595 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.582959 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.582936125 podStartE2EDuration="2.582936125s" podCreationTimestamp="2025-11-29 14:53:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:05.581562305 +0000 UTC m=+1483.568399957" watchObservedRunningTime="2025-11-29 14:53:05.582936125 +0000 UTC m=+1483.569773777"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.691630 4907 scope.go:117] "RemoveContainer" containerID="a06af6f54d1a144582ea59ec2783f88120c05fdbd4d7f559c2afea2ad660f6db"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.695051 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.709522 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.719445 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 29 14:53:05 crc kubenswrapper[4907]: E1129 14:53:05.721219 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerName="glance-httpd"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.721246 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerName="glance-httpd"
Nov 29 14:53:05 crc kubenswrapper[4907]: E1129 14:53:05.721313 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerName="glance-log"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.721321 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerName="glance-log"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.721642 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerName="glance-httpd"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.721671 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b638c730-93e5-475d-afd6-b1c83c3e4952" containerName="glance-log"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.730137 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.730868 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.734836 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.735008 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.895213 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/163933a3-d98e-4701-9124-c821395572eb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.895281 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-scripts\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.895311 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmj8j\" (UniqueName: \"kubernetes.io/projected/163933a3-d98e-4701-9124-c821395572eb-kube-api-access-kmj8j\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.895366 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.895388 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/163933a3-d98e-4701-9124-c821395572eb-logs\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.895407 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.895446 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-config-data\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.895498 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.998639 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-scripts\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.998697 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmj8j\" (UniqueName: \"kubernetes.io/projected/163933a3-d98e-4701-9124-c821395572eb-kube-api-access-kmj8j\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.998790 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.998822 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/163933a3-d98e-4701-9124-c821395572eb-logs\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.998843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.998876 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-config-data\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.998926 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.999046 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/163933a3-d98e-4701-9124-c821395572eb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.999351 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.999397 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/163933a3-d98e-4701-9124-c821395572eb-logs\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:05 crc kubenswrapper[4907]: I1129 14:53:05.999517 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/163933a3-d98e-4701-9124-c821395572eb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.004385 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-scripts\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.008848 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-config-data\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.010914 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.012935 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163933a3-d98e-4701-9124-c821395572eb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.013507 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmj8j\" (UniqueName: \"kubernetes.io/projected/163933a3-d98e-4701-9124-c821395572eb-kube-api-access-kmj8j\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.025415 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.032788 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"163933a3-d98e-4701-9124-c821395572eb\") " pod="openstack/glance-default-external-api-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.067235 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.078784 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.100805 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-config-data\") pod \"b3acb4f3-e011-4052-9169-15e103e3d081\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.100887 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7b49cb25-4ca7-4382-b378-749bf7081894\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.100912 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-run-httpd\") pod \"b3acb4f3-e011-4052-9169-15e103e3d081\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.100949 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-internal-tls-certs\") pod \"7b49cb25-4ca7-4382-b378-749bf7081894\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.100989 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-config-data\") pod \"7b49cb25-4ca7-4382-b378-749bf7081894\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101023 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-sg-core-conf-yaml\") pod \"b3acb4f3-e011-4052-9169-15e103e3d081\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101099 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-logs\") pod \"7b49cb25-4ca7-4382-b378-749bf7081894\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101153 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz6bp\" (UniqueName: \"kubernetes.io/projected/7b49cb25-4ca7-4382-b378-749bf7081894-kube-api-access-nz6bp\") pod \"7b49cb25-4ca7-4382-b378-749bf7081894\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101199 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-combined-ca-bundle\") pod \"7b49cb25-4ca7-4382-b378-749bf7081894\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101223 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-scripts\") pod \"b3acb4f3-e011-4052-9169-15e103e3d081\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101255 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-httpd-run\") pod \"7b49cb25-4ca7-4382-b378-749bf7081894\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101367 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-scripts\") pod \"7b49cb25-4ca7-4382-b378-749bf7081894\" (UID: \"7b49cb25-4ca7-4382-b378-749bf7081894\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101406 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtbrr\" (UniqueName: \"kubernetes.io/projected/b3acb4f3-e011-4052-9169-15e103e3d081-kube-api-access-qtbrr\") pod \"b3acb4f3-e011-4052-9169-15e103e3d081\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101431 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-combined-ca-bundle\") pod \"b3acb4f3-e011-4052-9169-15e103e3d081\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.101476 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-log-httpd\") pod \"b3acb4f3-e011-4052-9169-15e103e3d081\" (UID: \"b3acb4f3-e011-4052-9169-15e103e3d081\") "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.102093 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b3acb4f3-e011-4052-9169-15e103e3d081" (UID: "b3acb4f3-e011-4052-9169-15e103e3d081"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.102702 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.103846 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7b49cb25-4ca7-4382-b378-749bf7081894" (UID: "7b49cb25-4ca7-4382-b378-749bf7081894"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.104096 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-logs" (OuterVolumeSpecName: "logs") pod "7b49cb25-4ca7-4382-b378-749bf7081894" (UID: "7b49cb25-4ca7-4382-b378-749bf7081894"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.105922 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b3acb4f3-e011-4052-9169-15e103e3d081" (UID: "b3acb4f3-e011-4052-9169-15e103e3d081"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.107936 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "7b49cb25-4ca7-4382-b378-749bf7081894" (UID: "7b49cb25-4ca7-4382-b378-749bf7081894"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.115286 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3acb4f3-e011-4052-9169-15e103e3d081-kube-api-access-qtbrr" (OuterVolumeSpecName: "kube-api-access-qtbrr") pod "b3acb4f3-e011-4052-9169-15e103e3d081" (UID: "b3acb4f3-e011-4052-9169-15e103e3d081"). InnerVolumeSpecName "kube-api-access-qtbrr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.117942 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-scripts" (OuterVolumeSpecName: "scripts") pod "b3acb4f3-e011-4052-9169-15e103e3d081" (UID: "b3acb4f3-e011-4052-9169-15e103e3d081"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.157369 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-scripts" (OuterVolumeSpecName: "scripts") pod "7b49cb25-4ca7-4382-b378-749bf7081894" (UID: "7b49cb25-4ca7-4382-b378-749bf7081894"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.157559 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b49cb25-4ca7-4382-b378-749bf7081894-kube-api-access-nz6bp" (OuterVolumeSpecName: "kube-api-access-nz6bp") pod "7b49cb25-4ca7-4382-b378-749bf7081894" (UID: "7b49cb25-4ca7-4382-b378-749bf7081894"). InnerVolumeSpecName "kube-api-access-nz6bp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.216143 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-logs\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.216179 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz6bp\" (UniqueName: \"kubernetes.io/projected/7b49cb25-4ca7-4382-b378-749bf7081894-kube-api-access-nz6bp\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.216192 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.216200 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b49cb25-4ca7-4382-b378-749bf7081894-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.216209 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.216218 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtbrr\" (UniqueName: \"kubernetes.io/projected/b3acb4f3-e011-4052-9169-15e103e3d081-kube-api-access-qtbrr\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.216226 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3acb4f3-e011-4052-9169-15e103e3d081-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.216253 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.220576 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b3acb4f3-e011-4052-9169-15e103e3d081" (UID: "b3acb4f3-e011-4052-9169-15e103e3d081"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.301914 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-config-data" (OuterVolumeSpecName: "config-data") pod "7b49cb25-4ca7-4382-b378-749bf7081894" (UID: "7b49cb25-4ca7-4382-b378-749bf7081894"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.313296 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.317667 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.317692 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.317701 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.349000 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3acb4f3-e011-4052-9169-15e103e3d081" (UID: "b3acb4f3-e011-4052-9169-15e103e3d081"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.390593 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b49cb25-4ca7-4382-b378-749bf7081894" (UID: "7b49cb25-4ca7-4382-b378-749bf7081894"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.421030 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.423659 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.440272 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-config-data" (OuterVolumeSpecName: "config-data") pod "b3acb4f3-e011-4052-9169-15e103e3d081" (UID: "b3acb4f3-e011-4052-9169-15e103e3d081"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.449229 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7b49cb25-4ca7-4382-b378-749bf7081894" (UID: "7b49cb25-4ca7-4382-b378-749bf7081894"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.495753 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b638c730-93e5-475d-afd6-b1c83c3e4952" path="/var/lib/kubelet/pods/b638c730-93e5-475d-afd6-b1c83c3e4952/volumes" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.527665 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3acb4f3-e011-4052-9169-15e103e3d081-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.528227 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b49cb25-4ca7-4382-b378-749bf7081894-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.580833 4907 generic.go:334] "Generic (PLEG): container finished" podID="b3acb4f3-e011-4052-9169-15e103e3d081" containerID="cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9" exitCode=0 Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.580882 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerDied","Data":"cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9"} Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.580923 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3acb4f3-e011-4052-9169-15e103e3d081","Type":"ContainerDied","Data":"205cb843125d147f32840cd716c9fef6cbbe1b1b45bb2f52f2cc4a1eb3e5d274"} Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.580951 4907 scope.go:117] "RemoveContainer" containerID="da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.582154 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.586014 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.586469 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b49cb25-4ca7-4382-b378-749bf7081894","Type":"ContainerDied","Data":"c97cfc8b0c751b4b13ab7193896660bf44f991eef419d5bab7d33cf41f6cac48"} Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.618338 4907 scope.go:117] "RemoveContainer" containerID="f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.621681 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.638919 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.639091 4907 scope.go:117] "RemoveContainer" containerID="1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.654839 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.655366 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="proxy-httpd" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655378 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="proxy-httpd" Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.655391 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-httpd" 
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655397 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-httpd" Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.655424 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-log" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655453 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-log" Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.655469 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="ceilometer-central-agent" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655477 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="ceilometer-central-agent" Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.655506 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="ceilometer-notification-agent" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655512 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="ceilometer-notification-agent" Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.655522 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="sg-core" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655528 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="sg-core" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655711 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-httpd" 
Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655723 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" containerName="glance-log" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655741 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="sg-core" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655752 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="ceilometer-notification-agent" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655764 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="ceilometer-central-agent" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.655774 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" containerName="proxy-httpd" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.657282 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.669576 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.670027 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.675359 4907 scope.go:117] "RemoveContainer" containerID="cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.689507 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.713165 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.727248 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.729033 4907 scope.go:117] "RemoveContainer" containerID="da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9" Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.737636 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9\": container with ID starting with da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9 not found: ID does not exist" containerID="da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.737714 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9"} err="failed to get container status 
\"da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9\": rpc error: code = NotFound desc = could not find container \"da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9\": container with ID starting with da22964331259c0791fa9daaa1b7105553994e9f30f05709dcb2d79bcf462fd9 not found: ID does not exist" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.737743 4907 scope.go:117] "RemoveContainer" containerID="f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1" Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.742891 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1\": container with ID starting with f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1 not found: ID does not exist" containerID="f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.742952 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1"} err="failed to get container status \"f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1\": rpc error: code = NotFound desc = could not find container \"f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1\": container with ID starting with f026afe123deb20e02c134555df9efedb6dfb373aab8305c28f3ba9e35172bf1 not found: ID does not exist" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.742987 4907 scope.go:117] "RemoveContainer" containerID="1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a" Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.743523 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a\": container with ID starting with 1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a not found: ID does not exist" containerID="1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.743560 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a"} err="failed to get container status \"1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a\": rpc error: code = NotFound desc = could not find container \"1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a\": container with ID starting with 1c5aab0aa9703d9edb499e9032df3ce5508da4e814fcd00ad4a2f37163a9fb8a not found: ID does not exist" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.743580 4907 scope.go:117] "RemoveContainer" containerID="cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9" Nov 29 14:53:06 crc kubenswrapper[4907]: E1129 14:53:06.744995 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9\": container with ID starting with cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9 not found: ID does not exist" containerID="cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.745027 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9"} err="failed to get container status \"cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9\": rpc error: code = NotFound desc = could not find container \"cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9\": container with ID 
starting with cde2fc5bf1efe2957854742e9d1a21d8bebd73eaf90243596d497bcdc8aea6b9 not found: ID does not exist" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.745046 4907 scope.go:117] "RemoveContainer" containerID="2df3e3d52990970ce9e869c42b9bcf5b24cd86a53fa741f9a60d8153a701cbd4" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.759417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.759941 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.760058 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.760201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0759e100-595c-4f28-8934-25f0a3bb9010-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.760385 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h2kg\" (UniqueName: \"kubernetes.io/projected/0759e100-595c-4f28-8934-25f0a3bb9010-kube-api-access-2h2kg\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.760556 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0759e100-595c-4f28-8934-25f0a3bb9010-logs\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.760710 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.760888 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.780233 4907 scope.go:117] "RemoveContainer" containerID="fae5b500fdd97fec81f970e4aa5152fe8be097c18c5be82a016be44e262b1425" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.803899 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.812590 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.816234 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.816361 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.817768 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.854988 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.867107 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.867761 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-run-httpd\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: W1129 14:53:06.868080 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod163933a3_d98e_4701_9124_c821395572eb.slice/crio-4c5bc8b6a5e0faedfdd13668bdb86cd119a0d3c4c559cd65c711ffb6e6bac7c0 WatchSource:0}: Error finding container 4c5bc8b6a5e0faedfdd13668bdb86cd119a0d3c4c559cd65c711ffb6e6bac7c0: Status 404 returned error can't find the container with id 4c5bc8b6a5e0faedfdd13668bdb86cd119a0d3c4c559cd65c711ffb6e6bac7c0 Nov 29 
14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.868314 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-scripts\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.868404 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.868500 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.868596 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0759e100-595c-4f28-8934-25f0a3bb9010-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.868691 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-log-httpd\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.868781 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2h2kg\" (UniqueName: \"kubernetes.io/projected/0759e100-595c-4f28-8934-25f0a3bb9010-kube-api-access-2h2kg\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.868867 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0759e100-595c-4f28-8934-25f0a3bb9010-logs\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.868950 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.869020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-config-data\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.870254 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0759e100-595c-4f28-8934-25f0a3bb9010-logs\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.870866 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0759e100-595c-4f28-8934-25f0a3bb9010-httpd-run\") 
pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.870970 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.871028 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.871243 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm7jd\" (UniqueName: \"kubernetes.io/projected/89b791d6-67df-4388-838e-b44193576abb-kube-api-access-dm7jd\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.871336 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.871412 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.875426 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.876169 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.879013 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.885219 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0759e100-595c-4f28-8934-25f0a3bb9010-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.886484 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h2kg\" (UniqueName: \"kubernetes.io/projected/0759e100-595c-4f28-8934-25f0a3bb9010-kube-api-access-2h2kg\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " 
pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.908814 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0759e100-595c-4f28-8934-25f0a3bb9010\") " pod="openstack/glance-default-internal-api-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.973202 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.973267 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.973308 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-run-httpd\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.973374 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-scripts\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.973418 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-log-httpd\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.973672 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-config-data\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.973760 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm7jd\" (UniqueName: \"kubernetes.io/projected/89b791d6-67df-4388-838e-b44193576abb-kube-api-access-dm7jd\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.974123 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-log-httpd\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.974224 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-run-httpd\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.977388 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-scripts\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.977422 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-config-data\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.979052 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.979878 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:06 crc kubenswrapper[4907]: I1129 14:53:06.988719 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm7jd\" (UniqueName: \"kubernetes.io/projected/89b791d6-67df-4388-838e-b44193576abb-kube-api-access-dm7jd\") pod \"ceilometer-0\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " pod="openstack/ceilometer-0" Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 14:53:07.008465 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 14:53:07.162042 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 14:53:07.303207 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 14:53:07.354334 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 14:53:07.536662 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5wt4"] Nov 29 14:53:07 crc kubenswrapper[4907]: W1129 14:53:07.573688 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0759e100_595c_4f28_8934_25f0a3bb9010.slice/crio-1cd3acb1fab4a61d0765bc4296b44ab0af155dfb8c74aa629f9df231f2cb634a WatchSource:0}: Error finding container 1cd3acb1fab4a61d0765bc4296b44ab0af155dfb8c74aa629f9df231f2cb634a: Status 404 returned error can't find the container with id 1cd3acb1fab4a61d0765bc4296b44ab0af155dfb8c74aa629f9df231f2cb634a Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 14:53:07.611749 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"163933a3-d98e-4701-9124-c821395572eb","Type":"ContainerStarted","Data":"2fb19fc73e2c61aecf3aed79b063cc5fc506480f750e5a5b0936c9ab7e715acd"} Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 14:53:07.611793 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"163933a3-d98e-4701-9124-c821395572eb","Type":"ContainerStarted","Data":"4c5bc8b6a5e0faedfdd13668bdb86cd119a0d3c4c559cd65c711ffb6e6bac7c0"} Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 14:53:07.612461 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 
14:53:07.616362 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0759e100-595c-4f28-8934-25f0a3bb9010","Type":"ContainerStarted","Data":"1cd3acb1fab4a61d0765bc4296b44ab0af155dfb8c74aa629f9df231f2cb634a"} Nov 29 14:53:07 crc kubenswrapper[4907]: I1129 14:53:07.810758 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:08 crc kubenswrapper[4907]: I1129 14:53:08.494830 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b49cb25-4ca7-4382-b378-749bf7081894" path="/var/lib/kubelet/pods/7b49cb25-4ca7-4382-b378-749bf7081894/volumes" Nov 29 14:53:08 crc kubenswrapper[4907]: I1129 14:53:08.496335 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3acb4f3-e011-4052-9169-15e103e3d081" path="/var/lib/kubelet/pods/b3acb4f3-e011-4052-9169-15e103e3d081/volumes" Nov 29 14:53:08 crc kubenswrapper[4907]: I1129 14:53:08.632620 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"163933a3-d98e-4701-9124-c821395572eb","Type":"ContainerStarted","Data":"0818a34aa278e73044ceed3eb95dede3d1513ec137a3a47d77b8505ecbc45739"} Nov 29 14:53:08 crc kubenswrapper[4907]: I1129 14:53:08.635675 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0759e100-595c-4f28-8934-25f0a3bb9010","Type":"ContainerStarted","Data":"ef1a87c018ff6ac57e86e8e9f74f2935d0a61724d108670aa967ebf3a09aa379"} Nov 29 14:53:08 crc kubenswrapper[4907]: I1129 14:53:08.636896 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5wt4" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="registry-server" containerID="cri-o://3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944" gracePeriod=2 Nov 29 14:53:08 crc kubenswrapper[4907]: I1129 14:53:08.637137 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerStarted","Data":"55b749439abab66eccca01fb9e5d481ff7ee57d544b7bb0cd4e1d46cc6569ac8"} Nov 29 14:53:08 crc kubenswrapper[4907]: I1129 14:53:08.637166 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerStarted","Data":"11950eee35ac91975bf9d57f4978a276deede60fe21986ecc322488bec0210e2"} Nov 29 14:53:08 crc kubenswrapper[4907]: I1129 14:53:08.663488 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.663469553 podStartE2EDuration="3.663469553s" podCreationTimestamp="2025-11-29 14:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:08.657983647 +0000 UTC m=+1486.644821299" watchObservedRunningTime="2025-11-29 14:53:08.663469553 +0000 UTC m=+1486.650307205" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.150114 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.222477 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-utilities\") pod \"eed2c3ab-cd81-4930-853b-f7034be8e804\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.222553 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trw2p\" (UniqueName: \"kubernetes.io/projected/eed2c3ab-cd81-4930-853b-f7034be8e804-kube-api-access-trw2p\") pod \"eed2c3ab-cd81-4930-853b-f7034be8e804\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.222730 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-catalog-content\") pod \"eed2c3ab-cd81-4930-853b-f7034be8e804\" (UID: \"eed2c3ab-cd81-4930-853b-f7034be8e804\") " Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.223861 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-utilities" (OuterVolumeSpecName: "utilities") pod "eed2c3ab-cd81-4930-853b-f7034be8e804" (UID: "eed2c3ab-cd81-4930-853b-f7034be8e804"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.235584 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed2c3ab-cd81-4930-853b-f7034be8e804-kube-api-access-trw2p" (OuterVolumeSpecName: "kube-api-access-trw2p") pod "eed2c3ab-cd81-4930-853b-f7034be8e804" (UID: "eed2c3ab-cd81-4930-853b-f7034be8e804"). InnerVolumeSpecName "kube-api-access-trw2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.324763 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.324789 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trw2p\" (UniqueName: \"kubernetes.io/projected/eed2c3ab-cd81-4930-853b-f7034be8e804-kube-api-access-trw2p\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.326944 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eed2c3ab-cd81-4930-853b-f7034be8e804" (UID: "eed2c3ab-cd81-4930-853b-f7034be8e804"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.427722 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eed2c3ab-cd81-4930-853b-f7034be8e804-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.650585 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0759e100-595c-4f28-8934-25f0a3bb9010","Type":"ContainerStarted","Data":"dddb3f0c64cd20fb5302488d35083ef05e3404fee73a40b6304d875d8a027661"} Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.653706 4907 generic.go:334] "Generic (PLEG): container finished" podID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerID="3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944" exitCode=0 Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.653802 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-w5wt4" event={"ID":"eed2c3ab-cd81-4930-853b-f7034be8e804","Type":"ContainerDied","Data":"3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944"} Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.653747 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5wt4" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.653857 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wt4" event={"ID":"eed2c3ab-cd81-4930-853b-f7034be8e804","Type":"ContainerDied","Data":"e97c859a1732c989fb2dd1cd8464e21a870994c92c11ae17a32e9bdedf0a163f"} Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.653870 4907 scope.go:117] "RemoveContainer" containerID="3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.656127 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerStarted","Data":"c77fa2e5f782596bb383bc5d2f9e1d6a0b8efb6220c4a71b37bc1070c9eafe4d"} Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.679810 4907 scope.go:117] "RemoveContainer" containerID="121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.693935 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.693919412 podStartE2EDuration="3.693919412s" podCreationTimestamp="2025-11-29 14:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:09.683835965 +0000 UTC m=+1487.670673607" watchObservedRunningTime="2025-11-29 14:53:09.693919412 +0000 UTC m=+1487.680757054" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 
14:53:09.718505 4907 scope.go:117] "RemoveContainer" containerID="e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.719279 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5wt4"] Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.730554 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5wt4"] Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.761753 4907 scope.go:117] "RemoveContainer" containerID="3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944" Nov 29 14:53:09 crc kubenswrapper[4907]: E1129 14:53:09.762212 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944\": container with ID starting with 3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944 not found: ID does not exist" containerID="3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.762243 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944"} err="failed to get container status \"3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944\": rpc error: code = NotFound desc = could not find container \"3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944\": container with ID starting with 3b6b6800ee0daf8ef8b4d40a149b3b887dfddf869173aa6b5aa1450920315944 not found: ID does not exist" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.762263 4907 scope.go:117] "RemoveContainer" containerID="121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13" Nov 29 14:53:09 crc kubenswrapper[4907]: E1129 14:53:09.762668 4907 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13\": container with ID starting with 121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13 not found: ID does not exist" containerID="121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.762689 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13"} err="failed to get container status \"121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13\": rpc error: code = NotFound desc = could not find container \"121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13\": container with ID starting with 121930e5bdcc1dbd3a849f2bea90ea602e1a66433d01584490b2cbd845c35b13 not found: ID does not exist" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.762702 4907 scope.go:117] "RemoveContainer" containerID="e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536" Nov 29 14:53:09 crc kubenswrapper[4907]: E1129 14:53:09.762941 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536\": container with ID starting with e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536 not found: ID does not exist" containerID="e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536" Nov 29 14:53:09 crc kubenswrapper[4907]: I1129 14:53:09.763016 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536"} err="failed to get container status \"e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536\": rpc error: code = NotFound desc = could not find container 
\"e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536\": container with ID starting with e649d0a0548f44ffecfd6e098939697a671ae7bfe310536e15f493889c12b536 not found: ID does not exist" Nov 29 14:53:10 crc kubenswrapper[4907]: I1129 14:53:10.492571 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" path="/var/lib/kubelet/pods/eed2c3ab-cd81-4930-853b-f7034be8e804/volumes" Nov 29 14:53:10 crc kubenswrapper[4907]: I1129 14:53:10.683214 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerStarted","Data":"5227d6b5314d1986abdb30810b15794976478f895706a8760045e282db5df768"} Nov 29 14:53:11 crc kubenswrapper[4907]: I1129 14:53:11.730598 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerStarted","Data":"2b04e48baffd5019ca9f956d06da60efb60b877c9276c7e296f41838605857b3"} Nov 29 14:53:11 crc kubenswrapper[4907]: I1129 14:53:11.730909 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 14:53:11 crc kubenswrapper[4907]: I1129 14:53:11.752076 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.259610599 podStartE2EDuration="5.752059201s" podCreationTimestamp="2025-11-29 14:53:06 +0000 UTC" firstStartedPulling="2025-11-29 14:53:07.833695214 +0000 UTC m=+1485.820532866" lastFinishedPulling="2025-11-29 14:53:11.326143816 +0000 UTC m=+1489.312981468" observedRunningTime="2025-11-29 14:53:11.745561425 +0000 UTC m=+1489.732399077" watchObservedRunningTime="2025-11-29 14:53:11.752059201 +0000 UTC m=+1489.738896853" Nov 29 14:53:13 crc kubenswrapper[4907]: I1129 14:53:13.960213 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 29 
14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.659417 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gghpv"] Nov 29 14:53:14 crc kubenswrapper[4907]: E1129 14:53:14.660114 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="extract-utilities" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.660126 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="extract-utilities" Nov 29 14:53:14 crc kubenswrapper[4907]: E1129 14:53:14.660166 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="registry-server" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.660174 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="registry-server" Nov 29 14:53:14 crc kubenswrapper[4907]: E1129 14:53:14.660184 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="extract-content" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.660190 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="extract-content" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.660394 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed2c3ab-cd81-4930-853b-f7034be8e804" containerName="registry-server" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.661127 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.663999 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.665127 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.686963 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2vkt\" (UniqueName: \"kubernetes.io/projected/9701e823-f11a-4ce6-85de-80f705092f11-kube-api-access-l2vkt\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.687061 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-config-data\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.687109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.687176 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-scripts\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") 
" pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.693105 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gghpv"] Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.801322 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.801718 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-scripts\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.802088 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2vkt\" (UniqueName: \"kubernetes.io/projected/9701e823-f11a-4ce6-85de-80f705092f11-kube-api-access-l2vkt\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.802301 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-config-data\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.834538 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-config-data\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.839873 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.841087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-scripts\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:14 crc kubenswrapper[4907]: I1129 14:53:14.913114 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2vkt\" (UniqueName: \"kubernetes.io/projected/9701e823-f11a-4ce6-85de-80f705092f11-kube-api-access-l2vkt\") pod \"nova-cell0-cell-mapping-gghpv\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.013695 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.015222 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.033861 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.034121 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.040167 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.091934 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.093769 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.118223 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcrgk\" (UniqueName: \"kubernetes.io/projected/2a663d98-2309-4412-a832-12e75ad4098a-kube-api-access-xcrgk\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.118290 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-config-data\") pod \"nova-scheduler-0\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") " pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.118385 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-config-data\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" 
Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.118415 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") " pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.118460 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a663d98-2309-4412-a832-12e75ad4098a-logs\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.118497 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn8qh\" (UniqueName: \"kubernetes.io/projected/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-kube-api-access-qn8qh\") pod \"nova-scheduler-0\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") " pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.118567 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.148314 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.196495 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.202457 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.205654 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223377 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223424 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223480 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcrgk\" (UniqueName: \"kubernetes.io/projected/2a663d98-2309-4412-a832-12e75ad4098a-kube-api-access-xcrgk\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223517 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-config-data\") pod \"nova-scheduler-0\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") " pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223597 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-config-data\") pod \"nova-api-0\" (UID: 
\"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223621 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223642 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") " pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223669 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a663d98-2309-4412-a832-12e75ad4098a-logs\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223702 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn8qh\" (UniqueName: \"kubernetes.io/projected/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-kube-api-access-qn8qh\") pod \"nova-scheduler-0\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") " pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.223752 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zgfr\" (UniqueName: \"kubernetes.io/projected/38bf2698-d87c-4bea-87ae-35b657433ba5-kube-api-access-4zgfr\") pod \"nova-cell1-novncproxy-0\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: 
I1129 14:53:15.235130 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-config-data\") pod \"nova-scheduler-0\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") " pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.235237 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-config-data\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.235967 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a663d98-2309-4412-a832-12e75ad4098a-logs\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.245190 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") " pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.235250 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.248754 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.308482 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn8qh\" 
(UniqueName: \"kubernetes.io/projected/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-kube-api-access-qn8qh\") pod \"nova-scheduler-0\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") " pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.319105 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcrgk\" (UniqueName: \"kubernetes.io/projected/2a663d98-2309-4412-a832-12e75ad4098a-kube-api-access-xcrgk\") pod \"nova-api-0\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.319980 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.336110 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.336266 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zgfr\" (UniqueName: \"kubernetes.io/projected/38bf2698-d87c-4bea-87ae-35b657433ba5-kube-api-access-4zgfr\") pod \"nova-cell1-novncproxy-0\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.336294 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.349001 4907 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.349305 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.356638 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.359918 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.364204 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.373261 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zgfr\" (UniqueName: \"kubernetes.io/projected/38bf2698-d87c-4bea-87ae-35b657433ba5-kube-api-access-4zgfr\") pod \"nova-cell1-novncproxy-0\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.376815 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.405248 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.422757 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vtdtq"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.443369 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vtdtq"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.443511 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.446274 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a92d55-663e-463d-9ce9-60526260e213-logs\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.446337 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.446489 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g59z6\" (UniqueName: \"kubernetes.io/projected/d7a92d55-663e-463d-9ce9-60526260e213-kube-api-access-g59z6\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.446567 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-config-data\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.549562 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-config-data\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.549973 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-config\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.550041 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9lk7\" (UniqueName: \"kubernetes.io/projected/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-kube-api-access-k9lk7\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.550125 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-svc\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.550148 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.550280 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a92d55-663e-463d-9ce9-60526260e213-logs\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.550320 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.550373 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.550398 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.550552 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g59z6\" (UniqueName: 
\"kubernetes.io/projected/d7a92d55-663e-463d-9ce9-60526260e213-kube-api-access-g59z6\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.556062 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-config-data\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.556233 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a92d55-663e-463d-9ce9-60526260e213-logs\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.557018 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.561992 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.570575 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g59z6\" (UniqueName: \"kubernetes.io/projected/d7a92d55-663e-463d-9ce9-60526260e213-kube-api-access-g59z6\") pod \"nova-metadata-0\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.621483 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.660401 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-config\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.660788 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9lk7\" (UniqueName: \"kubernetes.io/projected/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-kube-api-access-k9lk7\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.662141 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-svc\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.662907 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.663474 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-svc\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc 
kubenswrapper[4907]: I1129 14:53:15.664497 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-config\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.667851 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.668578 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.668674 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.672565 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.672615 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.676363 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9lk7\" (UniqueName: \"kubernetes.io/projected/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-kube-api-access-k9lk7\") pod \"dnsmasq-dns-9b86998b5-vtdtq\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.723146 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.816790 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gghpv"] Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.821992 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:15 crc kubenswrapper[4907]: I1129 14:53:15.920422 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gghpv" event={"ID":"9701e823-f11a-4ce6-85de-80f705092f11","Type":"ContainerStarted","Data":"ad4ed4e48b64a119d69b78aae10f4c1ac3210e467904fa8161b6887d894ae3e8"} Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.067635 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.068589 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.104270 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.133623 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.159193 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.311499 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.471399 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.508634 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.555259 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4v7qp"] Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.557043 4907 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.560373 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.563576 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.594120 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-scripts\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.594243 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9sm\" (UniqueName: \"kubernetes.io/projected/424d2988-9e7a-460f-b621-8268a86daaa5-kube-api-access-7b9sm\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.594382 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-config-data\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.594418 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.622041 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4v7qp"] Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.673752 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vtdtq"] Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.696993 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-config-data\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.697050 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.697173 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-scripts\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.697224 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9sm\" (UniqueName: \"kubernetes.io/projected/424d2988-9e7a-460f-b621-8268a86daaa5-kube-api-access-7b9sm\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: 
\"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.708171 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-config-data\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.712222 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.712474 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-scripts\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.719347 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9sm\" (UniqueName: \"kubernetes.io/projected/424d2988-9e7a-460f-b621-8268a86daaa5-kube-api-access-7b9sm\") pod \"nova-cell1-conductor-db-sync-4v7qp\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") " pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.917162 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4v7qp" Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.941706 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" event={"ID":"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3","Type":"ContainerStarted","Data":"bb0dca8b765198896fe43bcba8ea0eb32f8be1c8334aaf1cd72607d3311ee860"} Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.941776 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" event={"ID":"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3","Type":"ContainerStarted","Data":"62a37cf7298b01083ef1722572a7cc37fd183010caa16a44338c05255cc6610c"} Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.958565 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38bf2698-d87c-4bea-87ae-35b657433ba5","Type":"ContainerStarted","Data":"fb3f552dca6ac1e9ba3df86381ee11157aaa74c2183236b56afe8b205d8446a9"} Nov 29 14:53:16 crc kubenswrapper[4907]: I1129 14:53:16.992851 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gghpv" event={"ID":"9701e823-f11a-4ce6-85de-80f705092f11","Type":"ContainerStarted","Data":"92b054cfe0a423f39cbc1c63cedaf00c15b521d76ff5f25c7b86233d09047ffb"} Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.008677 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.009426 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.028367 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"4d7d806d-53d1-4bdb-8469-0ecd96a9896f","Type":"ContainerStarted","Data":"a603c19c018b7d1b842354fdc73cd370d679bb24a62986d40b1ca8b64d3266f0"} Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.031634 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gghpv" podStartSLOduration=3.031613936 podStartE2EDuration="3.031613936s" podCreationTimestamp="2025-11-29 14:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:17.024715622 +0000 UTC m=+1495.011553274" watchObservedRunningTime="2025-11-29 14:53:17.031613936 +0000 UTC m=+1495.018451588" Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.039978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a663d98-2309-4412-a832-12e75ad4098a","Type":"ContainerStarted","Data":"6ad25f5eb25a071841decfd44028b4321de66b466706877fac102ddff8679c41"} Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.048530 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a92d55-663e-463d-9ce9-60526260e213","Type":"ContainerStarted","Data":"d8c267a2b34043a1ddfea199b501d647dde5dd2658092448a6de4705e7342e24"} Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.048672 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.048689 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.069676 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.088540 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Nov 29 14:53:17 crc kubenswrapper[4907]: I1129 14:53:17.636592 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4v7qp"] Nov 29 14:53:18 crc kubenswrapper[4907]: I1129 14:53:18.082098 4907 generic.go:334] "Generic (PLEG): container finished" podID="dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" containerID="bb0dca8b765198896fe43bcba8ea0eb32f8be1c8334aaf1cd72607d3311ee860" exitCode=0 Nov 29 14:53:18 crc kubenswrapper[4907]: I1129 14:53:18.083050 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" event={"ID":"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3","Type":"ContainerDied","Data":"bb0dca8b765198896fe43bcba8ea0eb32f8be1c8334aaf1cd72607d3311ee860"} Nov 29 14:53:18 crc kubenswrapper[4907]: I1129 14:53:18.083099 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" event={"ID":"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3","Type":"ContainerStarted","Data":"42554551cad1d9876fe163a761d3b2631971379fdde2e0cfff1a61e784378ca7"} Nov 29 14:53:18 crc kubenswrapper[4907]: I1129 14:53:18.083127 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:18 crc kubenswrapper[4907]: I1129 14:53:18.086648 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4v7qp" event={"ID":"424d2988-9e7a-460f-b621-8268a86daaa5","Type":"ContainerStarted","Data":"a995595350ea220e8aa8a9afa6d4b11dee58a300d1d1f04382834b607f861927"} Nov 29 14:53:18 crc kubenswrapper[4907]: I1129 14:53:18.087495 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:18 crc kubenswrapper[4907]: I1129 14:53:18.087680 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:18 crc kubenswrapper[4907]: I1129 
14:53:18.117629 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" podStartSLOduration=3.117612146 podStartE2EDuration="3.117612146s" podCreationTimestamp="2025-11-29 14:53:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:18.103794026 +0000 UTC m=+1496.090631678" watchObservedRunningTime="2025-11-29 14:53:18.117612146 +0000 UTC m=+1496.104449798" Nov 29 14:53:18 crc kubenswrapper[4907]: I1129 14:53:18.134738 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4v7qp" podStartSLOduration=2.134720058 podStartE2EDuration="2.134720058s" podCreationTimestamp="2025-11-29 14:53:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:18.120549019 +0000 UTC m=+1496.107386671" watchObservedRunningTime="2025-11-29 14:53:18.134720058 +0000 UTC m=+1496.121557710" Nov 29 14:53:19 crc kubenswrapper[4907]: I1129 14:53:19.070015 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 14:53:19 crc kubenswrapper[4907]: I1129 14:53:19.093431 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:19 crc kubenswrapper[4907]: I1129 14:53:19.101707 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4v7qp" event={"ID":"424d2988-9e7a-460f-b621-8268a86daaa5","Type":"ContainerStarted","Data":"e803d9379264c0df1c46bd27b3a7bafe84c3fc57c02f60339de6c2958b84bf58"} Nov 29 14:53:19 crc kubenswrapper[4907]: I1129 14:53:19.101765 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:53:19 crc kubenswrapper[4907]: I1129 14:53:19.101789 4907 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Nov 29 14:53:20 crc kubenswrapper[4907]: I1129 14:53:20.126152 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:53:20 crc kubenswrapper[4907]: I1129 14:53:20.126773 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:53:20 crc kubenswrapper[4907]: I1129 14:53:20.229067 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 14:53:20 crc kubenswrapper[4907]: I1129 14:53:20.229162 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 14:53:20 crc kubenswrapper[4907]: I1129 14:53:20.232198 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 14:53:20 crc kubenswrapper[4907]: I1129 14:53:20.795345 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:21 crc kubenswrapper[4907]: I1129 14:53:21.111368 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.157989 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d7d806d-53d1-4bdb-8469-0ecd96a9896f","Type":"ContainerStarted","Data":"dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440"} Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.160384 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a663d98-2309-4412-a832-12e75ad4098a","Type":"ContainerStarted","Data":"68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b"} Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.162406 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d7a92d55-663e-463d-9ce9-60526260e213","Type":"ContainerStarted","Data":"ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5"} Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.163785 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38bf2698-d87c-4bea-87ae-35b657433ba5","Type":"ContainerStarted","Data":"c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7"} Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.163929 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="38bf2698-d87c-4bea-87ae-35b657433ba5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7" gracePeriod=30 Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.184725 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.920988511 podStartE2EDuration="9.184706771s" podCreationTimestamp="2025-11-29 14:53:14 +0000 UTC" firstStartedPulling="2025-11-29 14:53:16.189968506 +0000 UTC m=+1494.176806158" lastFinishedPulling="2025-11-29 14:53:21.453686766 +0000 UTC m=+1499.440524418" observedRunningTime="2025-11-29 14:53:23.179072222 +0000 UTC m=+1501.165909884" watchObservedRunningTime="2025-11-29 14:53:23.184706771 +0000 UTC m=+1501.171544423" Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.208688 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.196366036 podStartE2EDuration="8.208665997s" podCreationTimestamp="2025-11-29 14:53:15 +0000 UTC" firstStartedPulling="2025-11-29 14:53:16.478840321 +0000 UTC m=+1494.465677983" lastFinishedPulling="2025-11-29 14:53:21.491140302 +0000 UTC m=+1499.477977944" observedRunningTime="2025-11-29 14:53:23.196924286 +0000 UTC m=+1501.183761938" 
watchObservedRunningTime="2025-11-29 14:53:23.208665997 +0000 UTC m=+1501.195503649" Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.347190 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.347546 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="ceilometer-central-agent" containerID="cri-o://55b749439abab66eccca01fb9e5d481ff7ee57d544b7bb0cd4e1d46cc6569ac8" gracePeriod=30 Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.347674 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="proxy-httpd" containerID="cri-o://2b04e48baffd5019ca9f956d06da60efb60b877c9276c7e296f41838605857b3" gracePeriod=30 Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.347732 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="sg-core" containerID="cri-o://5227d6b5314d1986abdb30810b15794976478f895706a8760045e282db5df768" gracePeriod=30 Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.347738 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="ceilometer-notification-agent" containerID="cri-o://c77fa2e5f782596bb383bc5d2f9e1d6a0b8efb6220c4a71b37bc1070c9eafe4d" gracePeriod=30 Nov 29 14:53:23 crc kubenswrapper[4907]: I1129 14:53:23.358638 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.228:3000/\": read tcp 10.217.0.2:53456->10.217.0.228:3000: read: connection reset by peer" 
Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.189857 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a663d98-2309-4412-a832-12e75ad4098a","Type":"ContainerStarted","Data":"29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf"} Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.214569 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a92d55-663e-463d-9ce9-60526260e213","Type":"ContainerStarted","Data":"0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e"} Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.214750 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d7a92d55-663e-463d-9ce9-60526260e213" containerName="nova-metadata-log" containerID="cri-o://ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5" gracePeriod=30 Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.214900 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d7a92d55-663e-463d-9ce9-60526260e213" containerName="nova-metadata-metadata" containerID="cri-o://0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e" gracePeriod=30 Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.229673 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.095535818 podStartE2EDuration="9.229655164s" podCreationTimestamp="2025-11-29 14:53:15 +0000 UTC" firstStartedPulling="2025-11-29 14:53:16.319634442 +0000 UTC m=+1494.306472094" lastFinishedPulling="2025-11-29 14:53:21.453753788 +0000 UTC m=+1499.440591440" observedRunningTime="2025-11-29 14:53:24.225100735 +0000 UTC m=+1502.211938387" watchObservedRunningTime="2025-11-29 14:53:24.229655164 +0000 UTC m=+1502.216492816" Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.230370 4907 generic.go:334] 
"Generic (PLEG): container finished" podID="89b791d6-67df-4388-838e-b44193576abb" containerID="2b04e48baffd5019ca9f956d06da60efb60b877c9276c7e296f41838605857b3" exitCode=0 Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.230410 4907 generic.go:334] "Generic (PLEG): container finished" podID="89b791d6-67df-4388-838e-b44193576abb" containerID="5227d6b5314d1986abdb30810b15794976478f895706a8760045e282db5df768" exitCode=2 Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.230418 4907 generic.go:334] "Generic (PLEG): container finished" podID="89b791d6-67df-4388-838e-b44193576abb" containerID="c77fa2e5f782596bb383bc5d2f9e1d6a0b8efb6220c4a71b37bc1070c9eafe4d" exitCode=0 Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.230425 4907 generic.go:334] "Generic (PLEG): container finished" podID="89b791d6-67df-4388-838e-b44193576abb" containerID="55b749439abab66eccca01fb9e5d481ff7ee57d544b7bb0cd4e1d46cc6569ac8" exitCode=0 Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.231021 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerDied","Data":"2b04e48baffd5019ca9f956d06da60efb60b877c9276c7e296f41838605857b3"} Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.231056 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerDied","Data":"5227d6b5314d1986abdb30810b15794976478f895706a8760045e282db5df768"} Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.231067 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerDied","Data":"c77fa2e5f782596bb383bc5d2f9e1d6a0b8efb6220c4a71b37bc1070c9eafe4d"} Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.231077 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerDied","Data":"55b749439abab66eccca01fb9e5d481ff7ee57d544b7bb0cd4e1d46cc6569ac8"} Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.253297 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.279658949 podStartE2EDuration="9.25327695s" podCreationTimestamp="2025-11-29 14:53:15 +0000 UTC" firstStartedPulling="2025-11-29 14:53:16.513324123 +0000 UTC m=+1494.500161775" lastFinishedPulling="2025-11-29 14:53:21.486942124 +0000 UTC m=+1499.473779776" observedRunningTime="2025-11-29 14:53:24.244598145 +0000 UTC m=+1502.231435797" watchObservedRunningTime="2025-11-29 14:53:24.25327695 +0000 UTC m=+1502.240114602" Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.784193 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.927511 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-log-httpd\") pod \"89b791d6-67df-4388-838e-b44193576abb\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.927843 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-run-httpd\") pod \"89b791d6-67df-4388-838e-b44193576abb\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.927955 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm7jd\" (UniqueName: \"kubernetes.io/projected/89b791d6-67df-4388-838e-b44193576abb-kube-api-access-dm7jd\") pod \"89b791d6-67df-4388-838e-b44193576abb\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " Nov 29 
14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.928014 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-sg-core-conf-yaml\") pod \"89b791d6-67df-4388-838e-b44193576abb\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.928133 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-config-data\") pod \"89b791d6-67df-4388-838e-b44193576abb\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.928219 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "89b791d6-67df-4388-838e-b44193576abb" (UID: "89b791d6-67df-4388-838e-b44193576abb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.928253 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-scripts\") pod \"89b791d6-67df-4388-838e-b44193576abb\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.928308 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-combined-ca-bundle\") pod \"89b791d6-67df-4388-838e-b44193576abb\" (UID: \"89b791d6-67df-4388-838e-b44193576abb\") " Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.928739 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "89b791d6-67df-4388-838e-b44193576abb" (UID: "89b791d6-67df-4388-838e-b44193576abb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.929330 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.929344 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89b791d6-67df-4388-838e-b44193576abb-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.937950 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-scripts" (OuterVolumeSpecName: "scripts") pod "89b791d6-67df-4388-838e-b44193576abb" (UID: "89b791d6-67df-4388-838e-b44193576abb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.970707 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b791d6-67df-4388-838e-b44193576abb-kube-api-access-dm7jd" (OuterVolumeSpecName: "kube-api-access-dm7jd") pod "89b791d6-67df-4388-838e-b44193576abb" (UID: "89b791d6-67df-4388-838e-b44193576abb"). InnerVolumeSpecName "kube-api-access-dm7jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:24 crc kubenswrapper[4907]: I1129 14:53:24.984741 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "89b791d6-67df-4388-838e-b44193576abb" (UID: "89b791d6-67df-4388-838e-b44193576abb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.074700 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm7jd\" (UniqueName: \"kubernetes.io/projected/89b791d6-67df-4388-838e-b44193576abb-kube-api-access-dm7jd\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.074731 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.074753 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.155522 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89b791d6-67df-4388-838e-b44193576abb" (UID: "89b791d6-67df-4388-838e-b44193576abb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.176952 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.199396 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-config-data" (OuterVolumeSpecName: "config-data") pod "89b791d6-67df-4388-838e-b44193576abb" (UID: "89b791d6-67df-4388-838e-b44193576abb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.209723 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.258071 4907 generic.go:334] "Generic (PLEG): container finished" podID="d7a92d55-663e-463d-9ce9-60526260e213" containerID="0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e" exitCode=0 Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.258100 4907 generic.go:334] "Generic (PLEG): container finished" podID="d7a92d55-663e-463d-9ce9-60526260e213" containerID="ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5" exitCode=143 Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.258232 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.258243 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a92d55-663e-463d-9ce9-60526260e213","Type":"ContainerDied","Data":"0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e"} Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.258280 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a92d55-663e-463d-9ce9-60526260e213","Type":"ContainerDied","Data":"ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5"} Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.258292 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a92d55-663e-463d-9ce9-60526260e213","Type":"ContainerDied","Data":"d8c267a2b34043a1ddfea199b501d647dde5dd2658092448a6de4705e7342e24"} Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.258317 4907 scope.go:117] "RemoveContainer" containerID="0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e" 
Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.278343 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89b791d6-67df-4388-838e-b44193576abb","Type":"ContainerDied","Data":"11950eee35ac91975bf9d57f4978a276deede60fe21986ecc322488bec0210e2"} Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.279944 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g59z6\" (UniqueName: \"kubernetes.io/projected/d7a92d55-663e-463d-9ce9-60526260e213-kube-api-access-g59z6\") pod \"d7a92d55-663e-463d-9ce9-60526260e213\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.280002 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-combined-ca-bundle\") pod \"d7a92d55-663e-463d-9ce9-60526260e213\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.280032 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-config-data\") pod \"d7a92d55-663e-463d-9ce9-60526260e213\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.280088 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a92d55-663e-463d-9ce9-60526260e213-logs\") pod \"d7a92d55-663e-463d-9ce9-60526260e213\" (UID: \"d7a92d55-663e-463d-9ce9-60526260e213\") " Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.281066 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7a92d55-663e-463d-9ce9-60526260e213-logs" (OuterVolumeSpecName: "logs") pod "d7a92d55-663e-463d-9ce9-60526260e213" (UID: 
"d7a92d55-663e-463d-9ce9-60526260e213"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.281698 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a92d55-663e-463d-9ce9-60526260e213-logs\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.281722 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89b791d6-67df-4388-838e-b44193576abb-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.282607 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.286741 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a92d55-663e-463d-9ce9-60526260e213-kube-api-access-g59z6" (OuterVolumeSpecName: "kube-api-access-g59z6") pod "d7a92d55-663e-463d-9ce9-60526260e213" (UID: "d7a92d55-663e-463d-9ce9-60526260e213"). InnerVolumeSpecName "kube-api-access-g59z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.287169 4907 scope.go:117] "RemoveContainer" containerID="ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.315453 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-config-data" (OuterVolumeSpecName: "config-data") pod "d7a92d55-663e-463d-9ce9-60526260e213" (UID: "d7a92d55-663e-463d-9ce9-60526260e213"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.328776 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7a92d55-663e-463d-9ce9-60526260e213" (UID: "d7a92d55-663e-463d-9ce9-60526260e213"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.334545 4907 scope.go:117] "RemoveContainer" containerID="0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e" Nov 29 14:53:25 crc kubenswrapper[4907]: E1129 14:53:25.338002 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e\": container with ID starting with 0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e not found: ID does not exist" containerID="0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.338050 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e"} err="failed to get container status \"0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e\": rpc error: code = NotFound desc = could not find container \"0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e\": container with ID starting with 0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e not found: ID does not exist" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.338076 4907 scope.go:117] "RemoveContainer" containerID="ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5" Nov 29 14:53:25 crc kubenswrapper[4907]: E1129 14:53:25.340045 4907 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5\": container with ID starting with ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5 not found: ID does not exist" containerID="ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.340092 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5"} err="failed to get container status \"ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5\": rpc error: code = NotFound desc = could not find container \"ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5\": container with ID starting with ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5 not found: ID does not exist" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.340116 4907 scope.go:117] "RemoveContainer" containerID="0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.343696 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e"} err="failed to get container status \"0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e\": rpc error: code = NotFound desc = could not find container \"0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e\": container with ID starting with 0c4c49515edb7a7c1fdf615f7baa21b0de76c6c0de83b8038f016587bfbf107e not found: ID does not exist" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.343740 4907 scope.go:117] "RemoveContainer" containerID="ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.346166 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5"} err="failed to get container status \"ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5\": rpc error: code = NotFound desc = could not find container \"ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5\": container with ID starting with ab3bcc16852e5d539937275f12472dca42ade83fd63cb70112f301e2e096caf5 not found: ID does not exist" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.346206 4907 scope.go:117] "RemoveContainer" containerID="2b04e48baffd5019ca9f956d06da60efb60b877c9276c7e296f41838605857b3" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.359784 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.370151 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.377726 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:25 crc kubenswrapper[4907]: E1129 14:53:25.378172 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="sg-core" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378188 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="sg-core" Nov 29 14:53:25 crc kubenswrapper[4907]: E1129 14:53:25.378206 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a92d55-663e-463d-9ce9-60526260e213" containerName="nova-metadata-log" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378212 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a92d55-663e-463d-9ce9-60526260e213" containerName="nova-metadata-log" Nov 29 14:53:25 crc kubenswrapper[4907]: E1129 14:53:25.378241 4907 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="ceilometer-notification-agent" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378247 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="ceilometer-notification-agent" Nov 29 14:53:25 crc kubenswrapper[4907]: E1129 14:53:25.378261 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a92d55-663e-463d-9ce9-60526260e213" containerName="nova-metadata-metadata" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378267 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a92d55-663e-463d-9ce9-60526260e213" containerName="nova-metadata-metadata" Nov 29 14:53:25 crc kubenswrapper[4907]: E1129 14:53:25.378284 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="proxy-httpd" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378291 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="proxy-httpd" Nov 29 14:53:25 crc kubenswrapper[4907]: E1129 14:53:25.378301 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="ceilometer-central-agent" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378307 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="ceilometer-central-agent" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378546 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="ceilometer-notification-agent" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378566 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="proxy-httpd" Nov 29 
14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378579 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a92d55-663e-463d-9ce9-60526260e213" containerName="nova-metadata-log" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378587 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="ceilometer-central-agent" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378596 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a92d55-663e-463d-9ce9-60526260e213" containerName="nova-metadata-metadata" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.378613 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b791d6-67df-4388-838e-b44193576abb" containerName="sg-core" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.380553 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.380579 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.380816 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.383002 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g59z6\" (UniqueName: \"kubernetes.io/projected/d7a92d55-663e-463d-9ce9-60526260e213-kube-api-access-g59z6\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.383024 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.383033 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a92d55-663e-463d-9ce9-60526260e213-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.383924 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.384095 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.393068 4907 scope.go:117] "RemoveContainer" containerID="5227d6b5314d1986abdb30810b15794976478f895706a8760045e282db5df768" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.401743 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.438340 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.441552 4907 scope.go:117] "RemoveContainer" containerID="c77fa2e5f782596bb383bc5d2f9e1d6a0b8efb6220c4a71b37bc1070c9eafe4d" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.467379 4907 scope.go:117] "RemoveContainer" 
containerID="55b749439abab66eccca01fb9e5d481ff7ee57d544b7bb0cd4e1d46cc6569ac8" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.484476 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-config-data\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.484519 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.484623 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-log-httpd\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.484689 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q88j\" (UniqueName: \"kubernetes.io/projected/9699098b-28d9-42d8-b43e-da4d4a655b8e-kube-api-access-2q88j\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.484732 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-scripts\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 
14:53:25.484820 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.484963 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-run-httpd\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.563075 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.563200 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.587624 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-run-httpd\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.587803 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-config-data\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.587872 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.587939 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-log-httpd\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.587973 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q88j\" (UniqueName: \"kubernetes.io/projected/9699098b-28d9-42d8-b43e-da4d4a655b8e-kube-api-access-2q88j\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.588029 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-scripts\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.588143 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.588340 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-run-httpd\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.588524 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-log-httpd\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.592187 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-scripts\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.592671 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.592946 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.593575 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-config-data\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.596840 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.612130 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.616281 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2q88j\" (UniqueName: \"kubernetes.io/projected/9699098b-28d9-42d8-b43e-da4d4a655b8e-kube-api-access-2q88j\") pod \"ceilometer-0\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.623677 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.624503 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.626785 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.630711 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.630901 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.635800 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.694722 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.695008 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91243c38-88da-4f49-b96d-07d724fa8f03-logs\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 
29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.695111 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtxrz\" (UniqueName: \"kubernetes.io/projected/91243c38-88da-4f49-b96d-07d724fa8f03-kube-api-access-xtxrz\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.695352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-config-data\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.695476 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.707918 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.801921 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91243c38-88da-4f49-b96d-07d724fa8f03-logs\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.802203 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtxrz\" (UniqueName: \"kubernetes.io/projected/91243c38-88da-4f49-b96d-07d724fa8f03-kube-api-access-xtxrz\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.802418 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-config-data\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.802536 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.802667 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.806172 4907 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91243c38-88da-4f49-b96d-07d724fa8f03-logs\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.806992 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.811109 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.814886 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-config-data\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.838615 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.843004 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtxrz\" (UniqueName: \"kubernetes.io/projected/91243c38-88da-4f49-b96d-07d724fa8f03-kube-api-access-xtxrz\") pod \"nova-metadata-0\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") " pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.953014 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:53:25 crc kubenswrapper[4907]: I1129 14:53:25.964421 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-txdv6"] Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.291945 4907 generic.go:334] "Generic (PLEG): container finished" podID="9701e823-f11a-4ce6-85de-80f705092f11" containerID="92b054cfe0a423f39cbc1c63cedaf00c15b521d76ff5f25c7b86233d09047ffb" exitCode=0 Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.292241 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gghpv" event={"ID":"9701e823-f11a-4ce6-85de-80f705092f11","Type":"ContainerDied","Data":"92b054cfe0a423f39cbc1c63cedaf00c15b521d76ff5f25c7b86233d09047ffb"} Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.300913 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" podUID="3dbbcded-256c-4054-9ba2-a7b1bde35aea" containerName="dnsmasq-dns" containerID="cri-o://3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db" gracePeriod=10 Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.353566 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.380646 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.508877 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b791d6-67df-4388-838e-b44193576abb" path="/var/lib/kubelet/pods/89b791d6-67df-4388-838e-b44193576abb/volumes" Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.509955 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a92d55-663e-463d-9ce9-60526260e213" path="/var/lib/kubelet/pods/d7a92d55-663e-463d-9ce9-60526260e213/volumes" Nov 29 14:53:26 crc 
kubenswrapper[4907]: W1129 14:53:26.586309 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91243c38_88da_4f49_b96d_07d724fa8f03.slice/crio-ef277806b9d6439e32d789fbe27a41d543bc2b0280a675e009628ae492951c28 WatchSource:0}: Error finding container ef277806b9d6439e32d789fbe27a41d543bc2b0280a675e009628ae492951c28: Status 404 returned error can't find the container with id ef277806b9d6439e32d789fbe27a41d543bc2b0280a675e009628ae492951c28 Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.589541 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.646654 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.231:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.646668 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.231:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 14:53:26 crc kubenswrapper[4907]: I1129 14:53:26.902396 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.054384 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-swift-storage-0\") pod \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.054544 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-sb\") pod \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.054725 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-svc\") pod \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.054800 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8d26\" (UniqueName: \"kubernetes.io/projected/3dbbcded-256c-4054-9ba2-a7b1bde35aea-kube-api-access-l8d26\") pod \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.054876 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-config\") pod \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.054941 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-nb\") pod \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\" (UID: \"3dbbcded-256c-4054-9ba2-a7b1bde35aea\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.083626 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dbbcded-256c-4054-9ba2-a7b1bde35aea-kube-api-access-l8d26" (OuterVolumeSpecName: "kube-api-access-l8d26") pod "3dbbcded-256c-4054-9ba2-a7b1bde35aea" (UID: "3dbbcded-256c-4054-9ba2-a7b1bde35aea"). InnerVolumeSpecName "kube-api-access-l8d26". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.169059 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8d26\" (UniqueName: \"kubernetes.io/projected/3dbbcded-256c-4054-9ba2-a7b1bde35aea-kube-api-access-l8d26\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.211149 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-config" (OuterVolumeSpecName: "config") pod "3dbbcded-256c-4054-9ba2-a7b1bde35aea" (UID: "3dbbcded-256c-4054-9ba2-a7b1bde35aea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.235419 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3dbbcded-256c-4054-9ba2-a7b1bde35aea" (UID: "3dbbcded-256c-4054-9ba2-a7b1bde35aea"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.270734 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3dbbcded-256c-4054-9ba2-a7b1bde35aea" (UID: "3dbbcded-256c-4054-9ba2-a7b1bde35aea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.278146 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.278176 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.278190 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.287520 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s9wpf"] Nov 29 14:53:27 crc kubenswrapper[4907]: E1129 14:53:27.288275 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbbcded-256c-4054-9ba2-a7b1bde35aea" containerName="dnsmasq-dns" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.288345 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbbcded-256c-4054-9ba2-a7b1bde35aea" containerName="dnsmasq-dns" Nov 29 14:53:27 crc kubenswrapper[4907]: E1129 14:53:27.288429 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3dbbcded-256c-4054-9ba2-a7b1bde35aea" containerName="init" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.288514 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbbcded-256c-4054-9ba2-a7b1bde35aea" containerName="init" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.288862 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dbbcded-256c-4054-9ba2-a7b1bde35aea" containerName="dnsmasq-dns" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.290702 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.297505 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9wpf"] Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.298413 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3dbbcded-256c-4054-9ba2-a7b1bde35aea" (UID: "3dbbcded-256c-4054-9ba2-a7b1bde35aea"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.325352 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerStarted","Data":"a09c9b921716c79afe50079e43f4d9817be3c1b9e87a9fda653cb227c7d2d83a"} Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.327155 4907 generic.go:334] "Generic (PLEG): container finished" podID="3dbbcded-256c-4054-9ba2-a7b1bde35aea" containerID="3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db" exitCode=0 Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.327198 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" event={"ID":"3dbbcded-256c-4054-9ba2-a7b1bde35aea","Type":"ContainerDied","Data":"3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db"} Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.327215 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" event={"ID":"3dbbcded-256c-4054-9ba2-a7b1bde35aea","Type":"ContainerDied","Data":"d977c1cc4cf526dd4bdaf7e4b8a62b5ef27168eec96f92fbab4252fb6f49fd14"} Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.327233 4907 scope.go:117] "RemoveContainer" containerID="3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.327386 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-txdv6" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.342229 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91243c38-88da-4f49-b96d-07d724fa8f03","Type":"ContainerStarted","Data":"9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2"} Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.342268 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91243c38-88da-4f49-b96d-07d724fa8f03","Type":"ContainerStarted","Data":"ef277806b9d6439e32d789fbe27a41d543bc2b0280a675e009628ae492951c28"} Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.382901 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-utilities\") pod \"certified-operators-s9wpf\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.382989 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-catalog-content\") pod \"certified-operators-s9wpf\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.383040 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl885\" (UniqueName: \"kubernetes.io/projected/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-kube-api-access-kl885\") pod \"certified-operators-s9wpf\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.383170 4907 
reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.388944 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3dbbcded-256c-4054-9ba2-a7b1bde35aea" (UID: "3dbbcded-256c-4054-9ba2-a7b1bde35aea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.392520 4907 scope.go:117] "RemoveContainer" containerID="692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.442357 4907 scope.go:117] "RemoveContainer" containerID="3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db" Nov 29 14:53:27 crc kubenswrapper[4907]: E1129 14:53:27.445417 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db\": container with ID starting with 3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db not found: ID does not exist" containerID="3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.445479 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db"} err="failed to get container status \"3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db\": rpc error: code = NotFound desc = could not find container \"3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db\": container with ID starting with 
3bef29de5a29be4092c04133441b92f471a10f690aae50445330e2020cee03db not found: ID does not exist" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.445500 4907 scope.go:117] "RemoveContainer" containerID="692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc" Nov 29 14:53:27 crc kubenswrapper[4907]: E1129 14:53:27.450823 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc\": container with ID starting with 692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc not found: ID does not exist" containerID="692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.450850 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc"} err="failed to get container status \"692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc\": rpc error: code = NotFound desc = could not find container \"692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc\": container with ID starting with 692e03ce2127d95d592ee435df7b8722f8952bc2f7d1601a0d0f197703f593dc not found: ID does not exist" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.489948 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-utilities\") pod \"certified-operators-s9wpf\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.490035 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-catalog-content\") pod 
\"certified-operators-s9wpf\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.490105 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl885\" (UniqueName: \"kubernetes.io/projected/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-kube-api-access-kl885\") pod \"certified-operators-s9wpf\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.490185 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dbbcded-256c-4054-9ba2-a7b1bde35aea-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.490827 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-utilities\") pod \"certified-operators-s9wpf\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.491168 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-catalog-content\") pod \"certified-operators-s9wpf\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.512895 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl885\" (UniqueName: \"kubernetes.io/projected/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-kube-api-access-kl885\") pod \"certified-operators-s9wpf\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc 
kubenswrapper[4907]: I1129 14:53:27.664268 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.664741 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-txdv6"] Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.692059 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-txdv6"] Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.728774 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.809014 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-combined-ca-bundle\") pod \"9701e823-f11a-4ce6-85de-80f705092f11\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.809322 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-scripts\") pod \"9701e823-f11a-4ce6-85de-80f705092f11\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.809354 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2vkt\" (UniqueName: \"kubernetes.io/projected/9701e823-f11a-4ce6-85de-80f705092f11-kube-api-access-l2vkt\") pod \"9701e823-f11a-4ce6-85de-80f705092f11\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.809415 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-config-data\") pod \"9701e823-f11a-4ce6-85de-80f705092f11\" (UID: \"9701e823-f11a-4ce6-85de-80f705092f11\") " Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.820637 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-scripts" (OuterVolumeSpecName: "scripts") pod "9701e823-f11a-4ce6-85de-80f705092f11" (UID: "9701e823-f11a-4ce6-85de-80f705092f11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.826637 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9701e823-f11a-4ce6-85de-80f705092f11-kube-api-access-l2vkt" (OuterVolumeSpecName: "kube-api-access-l2vkt") pod "9701e823-f11a-4ce6-85de-80f705092f11" (UID: "9701e823-f11a-4ce6-85de-80f705092f11"). InnerVolumeSpecName "kube-api-access-l2vkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.859639 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9701e823-f11a-4ce6-85de-80f705092f11" (UID: "9701e823-f11a-4ce6-85de-80f705092f11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.894553 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-config-data" (OuterVolumeSpecName: "config-data") pod "9701e823-f11a-4ce6-85de-80f705092f11" (UID: "9701e823-f11a-4ce6-85de-80f705092f11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.911457 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.911490 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2vkt\" (UniqueName: \"kubernetes.io/projected/9701e823-f11a-4ce6-85de-80f705092f11-kube-api-access-l2vkt\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.911500 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:27 crc kubenswrapper[4907]: I1129 14:53:27.911508 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9701e823-f11a-4ce6-85de-80f705092f11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.120187 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-56fkz"] Nov 29 14:53:28 crc kubenswrapper[4907]: E1129 14:53:28.120915 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9701e823-f11a-4ce6-85de-80f705092f11" containerName="nova-manage" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.120946 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9701e823-f11a-4ce6-85de-80f705092f11" containerName="nova-manage" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.121282 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9701e823-f11a-4ce6-85de-80f705092f11" containerName="nova-manage" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.122535 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.131625 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-56fkz"] Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.228414 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhdzl\" (UniqueName: \"kubernetes.io/projected/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-kube-api-access-fhdzl\") pod \"aodh-db-create-56fkz\" (UID: \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\") " pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.228757 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-operator-scripts\") pod \"aodh-db-create-56fkz\" (UID: \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\") " pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.233365 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0272-account-create-update-zc6hq"] Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.234789 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.237940 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.256703 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0272-account-create-update-zc6hq"] Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.326819 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9wpf"] Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.332603 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhdzl\" (UniqueName: \"kubernetes.io/projected/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-kube-api-access-fhdzl\") pod \"aodh-db-create-56fkz\" (UID: \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\") " pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.332703 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd168f22-6342-4cce-95b4-3793763c8b41-operator-scripts\") pod \"aodh-0272-account-create-update-zc6hq\" (UID: \"bd168f22-6342-4cce-95b4-3793763c8b41\") " pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.332814 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw9q9\" (UniqueName: \"kubernetes.io/projected/bd168f22-6342-4cce-95b4-3793763c8b41-kube-api-access-vw9q9\") pod \"aodh-0272-account-create-update-zc6hq\" (UID: \"bd168f22-6342-4cce-95b4-3793763c8b41\") " pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.333057 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-operator-scripts\") pod \"aodh-db-create-56fkz\" (UID: \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\") " pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.334330 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-operator-scripts\") pod \"aodh-db-create-56fkz\" (UID: \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\") " pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:28 crc kubenswrapper[4907]: W1129 14:53:28.348762 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd02179_a472_4e40_bd4c_4c6cd9a5155d.slice/crio-161a89a75134649622dead331100b40eb963cba4c902ee5731099e61f129cde4 WatchSource:0}: Error finding container 161a89a75134649622dead331100b40eb963cba4c902ee5731099e61f129cde4: Status 404 returned error can't find the container with id 161a89a75134649622dead331100b40eb963cba4c902ee5731099e61f129cde4 Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.370239 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhdzl\" (UniqueName: \"kubernetes.io/projected/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-kube-api-access-fhdzl\") pod \"aodh-db-create-56fkz\" (UID: \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\") " pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.431305 4907 generic.go:334] "Generic (PLEG): container finished" podID="424d2988-9e7a-460f-b621-8268a86daaa5" containerID="e803d9379264c0df1c46bd27b3a7bafe84c3fc57c02f60339de6c2958b84bf58" exitCode=0 Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.431397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4v7qp" 
event={"ID":"424d2988-9e7a-460f-b621-8268a86daaa5","Type":"ContainerDied","Data":"e803d9379264c0df1c46bd27b3a7bafe84c3fc57c02f60339de6c2958b84bf58"} Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.435956 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd168f22-6342-4cce-95b4-3793763c8b41-operator-scripts\") pod \"aodh-0272-account-create-update-zc6hq\" (UID: \"bd168f22-6342-4cce-95b4-3793763c8b41\") " pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.436069 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw9q9\" (UniqueName: \"kubernetes.io/projected/bd168f22-6342-4cce-95b4-3793763c8b41-kube-api-access-vw9q9\") pod \"aodh-0272-account-create-update-zc6hq\" (UID: \"bd168f22-6342-4cce-95b4-3793763c8b41\") " pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.436682 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd168f22-6342-4cce-95b4-3793763c8b41-operator-scripts\") pod \"aodh-0272-account-create-update-zc6hq\" (UID: \"bd168f22-6342-4cce-95b4-3793763c8b41\") " pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.441759 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91243c38-88da-4f49-b96d-07d724fa8f03","Type":"ContainerStarted","Data":"02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b"} Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.466145 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw9q9\" (UniqueName: \"kubernetes.io/projected/bd168f22-6342-4cce-95b4-3793763c8b41-kube-api-access-vw9q9\") pod \"aodh-0272-account-create-update-zc6hq\" 
(UID: \"bd168f22-6342-4cce-95b4-3793763c8b41\") " pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.468557 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerStarted","Data":"aafd8f938f2617b87b37ff985994451ac28aeadc5356a23fdea264fa9979f417"} Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.468598 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerStarted","Data":"6ae044a335c0711e295c7c6e18a546c7690608b10c13f2d2d7ababdcde020aa0"} Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.473507 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.490689 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.516391 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dbbcded-256c-4054-9ba2-a7b1bde35aea" path="/var/lib/kubelet/pods/3dbbcded-256c-4054-9ba2-a7b1bde35aea/volumes" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.520170 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-log" containerID="cri-o://68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b" gracePeriod=30 Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.520467 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gghpv" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.521677 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gghpv" event={"ID":"9701e823-f11a-4ce6-85de-80f705092f11","Type":"ContainerDied","Data":"ad4ed4e48b64a119d69b78aae10f4c1ac3210e467904fa8161b6887d894ae3e8"} Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.521724 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad4ed4e48b64a119d69b78aae10f4c1ac3210e467904fa8161b6887d894ae3e8" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.521771 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-api" containerID="cri-o://29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf" gracePeriod=30 Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.528648 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.528625773 podStartE2EDuration="3.528625773s" podCreationTimestamp="2025-11-29 14:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:28.49446275 +0000 UTC m=+1506.481300402" watchObservedRunningTime="2025-11-29 14:53:28.528625773 +0000 UTC m=+1506.515463415" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.568501 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.568683 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4d7d806d-53d1-4bdb-8469-0ecd96a9896f" containerName="nova-scheduler-scheduler" containerID="cri-o://dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440" 
gracePeriod=30 Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.569397 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:28 crc kubenswrapper[4907]: I1129 14:53:28.582597 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:29 crc kubenswrapper[4907]: I1129 14:53:29.522903 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0272-account-create-update-zc6hq"] Nov 29 14:53:29 crc kubenswrapper[4907]: I1129 14:53:29.549517 4907 generic.go:334] "Generic (PLEG): container finished" podID="2a663d98-2309-4412-a832-12e75ad4098a" containerID="68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b" exitCode=143 Nov 29 14:53:29 crc kubenswrapper[4907]: I1129 14:53:29.549615 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a663d98-2309-4412-a832-12e75ad4098a","Type":"ContainerDied","Data":"68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b"} Nov 29 14:53:29 crc kubenswrapper[4907]: I1129 14:53:29.557154 4907 generic.go:334] "Generic (PLEG): container finished" podID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerID="24fccc4d6722769af734c895de91a977e92c669646e950ddbe6b19335ea8a27f" exitCode=0 Nov 29 14:53:29 crc kubenswrapper[4907]: I1129 14:53:29.557217 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9wpf" event={"ID":"9bd02179-a472-4e40-bd4c-4c6cd9a5155d","Type":"ContainerDied","Data":"24fccc4d6722769af734c895de91a977e92c669646e950ddbe6b19335ea8a27f"} Nov 29 14:53:29 crc kubenswrapper[4907]: I1129 14:53:29.557243 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9wpf" event={"ID":"9bd02179-a472-4e40-bd4c-4c6cd9a5155d","Type":"ContainerStarted","Data":"161a89a75134649622dead331100b40eb963cba4c902ee5731099e61f129cde4"} Nov 29 
14:53:29 crc kubenswrapper[4907]: I1129 14:53:29.559789 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-56fkz"]
Nov 29 14:53:29 crc kubenswrapper[4907]: I1129 14:53:29.563575 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerStarted","Data":"a5e21dfe54f590bf40eea699e38869dcb68c98b667e5f2144c85826a3d8a9302"}
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.347578 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.354042 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4v7qp"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.396322 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-combined-ca-bundle\") pod \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") "
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.397069 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-config-data\") pod \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") "
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.397225 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn8qh\" (UniqueName: \"kubernetes.io/projected/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-kube-api-access-qn8qh\") pod \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\" (UID: \"4d7d806d-53d1-4bdb-8469-0ecd96a9896f\") "
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.422510 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-kube-api-access-qn8qh" (OuterVolumeSpecName: "kube-api-access-qn8qh") pod "4d7d806d-53d1-4bdb-8469-0ecd96a9896f" (UID: "4d7d806d-53d1-4bdb-8469-0ecd96a9896f"). InnerVolumeSpecName "kube-api-access-qn8qh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.452204 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d7d806d-53d1-4bdb-8469-0ecd96a9896f" (UID: "4d7d806d-53d1-4bdb-8469-0ecd96a9896f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.477628 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-config-data" (OuterVolumeSpecName: "config-data") pod "4d7d806d-53d1-4bdb-8469-0ecd96a9896f" (UID: "4d7d806d-53d1-4bdb-8469-0ecd96a9896f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.499214 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-combined-ca-bundle\") pod \"424d2988-9e7a-460f-b621-8268a86daaa5\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") "
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.499274 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-scripts\") pod \"424d2988-9e7a-460f-b621-8268a86daaa5\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") "
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.499467 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-config-data\") pod \"424d2988-9e7a-460f-b621-8268a86daaa5\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") "
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.499574 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b9sm\" (UniqueName: \"kubernetes.io/projected/424d2988-9e7a-460f-b621-8268a86daaa5-kube-api-access-7b9sm\") pod \"424d2988-9e7a-460f-b621-8268a86daaa5\" (UID: \"424d2988-9e7a-460f-b621-8268a86daaa5\") "
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.500145 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.500162 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn8qh\" (UniqueName: \"kubernetes.io/projected/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-kube-api-access-qn8qh\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.500171 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7d806d-53d1-4bdb-8469-0ecd96a9896f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.503993 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-scripts" (OuterVolumeSpecName: "scripts") pod "424d2988-9e7a-460f-b621-8268a86daaa5" (UID: "424d2988-9e7a-460f-b621-8268a86daaa5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.517430 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424d2988-9e7a-460f-b621-8268a86daaa5-kube-api-access-7b9sm" (OuterVolumeSpecName: "kube-api-access-7b9sm") pod "424d2988-9e7a-460f-b621-8268a86daaa5" (UID: "424d2988-9e7a-460f-b621-8268a86daaa5"). InnerVolumeSpecName "kube-api-access-7b9sm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.551223 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 29 14:53:30 crc kubenswrapper[4907]: E1129 14:53:30.551870 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424d2988-9e7a-460f-b621-8268a86daaa5" containerName="nova-cell1-conductor-db-sync"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.551886 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="424d2988-9e7a-460f-b621-8268a86daaa5" containerName="nova-cell1-conductor-db-sync"
Nov 29 14:53:30 crc kubenswrapper[4907]: E1129 14:53:30.551924 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7d806d-53d1-4bdb-8469-0ecd96a9896f" containerName="nova-scheduler-scheduler"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.551930 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7d806d-53d1-4bdb-8469-0ecd96a9896f" containerName="nova-scheduler-scheduler"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.552150 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="424d2988-9e7a-460f-b621-8268a86daaa5" containerName="nova-cell1-conductor-db-sync"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.552187 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d7d806d-53d1-4bdb-8469-0ecd96a9896f" containerName="nova-scheduler-scheduler"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.553056 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.562534 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "424d2988-9e7a-460f-b621-8268a86daaa5" (UID: "424d2988-9e7a-460f-b621-8268a86daaa5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.568459 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.577581 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-config-data" (OuterVolumeSpecName: "config-data") pod "424d2988-9e7a-460f-b621-8268a86daaa5" (UID: "424d2988-9e7a-460f-b621-8268a86daaa5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.585224 4907 generic.go:334] "Generic (PLEG): container finished" podID="5a4137fc-d9f8-46ae-9740-cf388fcb54f1" containerID="f5fe780a64700774a76b6303d085ae8ab2b8872fcd14c24eb01a002df6dcf958" exitCode=0
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.585426 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-56fkz" event={"ID":"5a4137fc-d9f8-46ae-9740-cf388fcb54f1","Type":"ContainerDied","Data":"f5fe780a64700774a76b6303d085ae8ab2b8872fcd14c24eb01a002df6dcf958"}
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.585503 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-56fkz" event={"ID":"5a4137fc-d9f8-46ae-9740-cf388fcb54f1","Type":"ContainerStarted","Data":"17a52127f36b52882a09332974de109ce78f2452963bb09d10ef061c88244300"}
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.587993 4907 generic.go:334] "Generic (PLEG): container finished" podID="bd168f22-6342-4cce-95b4-3793763c8b41" containerID="27ea6a2ac4f48b7ceb965a7dea9a7a669e9fb4b5a46bd4ed3b5bf057a4414d8c" exitCode=0
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.588038 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0272-account-create-update-zc6hq" event={"ID":"bd168f22-6342-4cce-95b4-3793763c8b41","Type":"ContainerDied","Data":"27ea6a2ac4f48b7ceb965a7dea9a7a669e9fb4b5a46bd4ed3b5bf057a4414d8c"}
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.588063 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0272-account-create-update-zc6hq" event={"ID":"bd168f22-6342-4cce-95b4-3793763c8b41","Type":"ContainerStarted","Data":"839807cf8ea95234056f2a3a55dd1ae75d35b408946ba484ab6fef33000a7931"}
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.591556 4907 generic.go:334] "Generic (PLEG): container finished" podID="4d7d806d-53d1-4bdb-8469-0ecd96a9896f" containerID="dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440" exitCode=0
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.591628 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d7d806d-53d1-4bdb-8469-0ecd96a9896f","Type":"ContainerDied","Data":"dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440"}
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.591646 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d7d806d-53d1-4bdb-8469-0ecd96a9896f","Type":"ContainerDied","Data":"a603c19c018b7d1b842354fdc73cd370d679bb24a62986d40b1ca8b64d3266f0"}
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.591663 4907 scope.go:117] "RemoveContainer" containerID="dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.591788 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.601871 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4v7qp" event={"ID":"424d2988-9e7a-460f-b621-8268a86daaa5","Type":"ContainerDied","Data":"a995595350ea220e8aa8a9afa6d4b11dee58a300d1d1f04382834b607f861927"}
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.601969 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a995595350ea220e8aa8a9afa6d4b11dee58a300d1d1f04382834b607f861927"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.602012 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d959bbb-e174-4315-935c-18f5cc65008c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3d959bbb-e174-4315-935c-18f5cc65008c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.602269 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhlrj\" (UniqueName: \"kubernetes.io/projected/3d959bbb-e174-4315-935c-18f5cc65008c-kube-api-access-rhlrj\") pod \"nova-cell1-conductor-0\" (UID: \"3d959bbb-e174-4315-935c-18f5cc65008c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.602403 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d959bbb-e174-4315-935c-18f5cc65008c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3d959bbb-e174-4315-935c-18f5cc65008c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.602595 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4v7qp"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.603199 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.603362 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.610508 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424d2988-9e7a-460f-b621-8268a86daaa5-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.610623 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b9sm\" (UniqueName: \"kubernetes.io/projected/424d2988-9e7a-460f-b621-8268a86daaa5-kube-api-access-7b9sm\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.612695 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9wpf" event={"ID":"9bd02179-a472-4e40-bd4c-4c6cd9a5155d","Type":"ContainerStarted","Data":"2a647c7c8a5e16eac0b1901556f10923e979e36c40e51ec813c7ec7400d78489"}
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.612928 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="91243c38-88da-4f49-b96d-07d724fa8f03" containerName="nova-metadata-log" containerID="cri-o://9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2" gracePeriod=30
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.613513 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="91243c38-88da-4f49-b96d-07d724fa8f03" containerName="nova-metadata-metadata" containerID="cri-o://02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b" gracePeriod=30
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.654426 4907 scope.go:117] "RemoveContainer" containerID="dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.654675 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 29 14:53:30 crc kubenswrapper[4907]: E1129 14:53:30.656982 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440\": container with ID starting with dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440 not found: ID does not exist" containerID="dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.657009 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440"} err="failed to get container status \"dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440\": rpc error: code = NotFound desc = could not find container \"dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440\": container with ID starting with dae4367e392f103ac2489ffa9ab5a315764e5b126883041024383b75f3284440 not found: ID does not exist"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.690997 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.712780 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d959bbb-e174-4315-935c-18f5cc65008c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3d959bbb-e174-4315-935c-18f5cc65008c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.713173 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhlrj\" (UniqueName: \"kubernetes.io/projected/3d959bbb-e174-4315-935c-18f5cc65008c-kube-api-access-rhlrj\") pod \"nova-cell1-conductor-0\" (UID: \"3d959bbb-e174-4315-935c-18f5cc65008c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.713219 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d959bbb-e174-4315-935c-18f5cc65008c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3d959bbb-e174-4315-935c-18f5cc65008c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.717190 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d959bbb-e174-4315-935c-18f5cc65008c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3d959bbb-e174-4315-935c-18f5cc65008c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.721187 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d959bbb-e174-4315-935c-18f5cc65008c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3d959bbb-e174-4315-935c-18f5cc65008c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.727300 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.729194 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.736668 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.737222 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhlrj\" (UniqueName: \"kubernetes.io/projected/3d959bbb-e174-4315-935c-18f5cc65008c-kube-api-access-rhlrj\") pod \"nova-cell1-conductor-0\" (UID: \"3d959bbb-e174-4315-935c-18f5cc65008c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.743075 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.814876 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.815048 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-config-data\") pod \"nova-scheduler-0\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.815167 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v4bf\" (UniqueName: \"kubernetes.io/projected/66fa6e0b-4574-4377-ab95-8a8843ffde6d-kube-api-access-5v4bf\") pod \"nova-scheduler-0\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.894165 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.917844 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.917959 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-config-data\") pod \"nova-scheduler-0\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.918044 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v4bf\" (UniqueName: \"kubernetes.io/projected/66fa6e0b-4574-4377-ab95-8a8843ffde6d-kube-api-access-5v4bf\") pod \"nova-scheduler-0\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.929323 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-config-data\") pod \"nova-scheduler-0\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.952774 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " pod="openstack/nova-scheduler-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.954669 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.954721 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 29 14:53:30 crc kubenswrapper[4907]: I1129 14:53:30.956363 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v4bf\" (UniqueName: \"kubernetes.io/projected/66fa6e0b-4574-4377-ab95-8a8843ffde6d-kube-api-access-5v4bf\") pod \"nova-scheduler-0\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " pod="openstack/nova-scheduler-0"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.059996 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.357785 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.430221 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtxrz\" (UniqueName: \"kubernetes.io/projected/91243c38-88da-4f49-b96d-07d724fa8f03-kube-api-access-xtxrz\") pod \"91243c38-88da-4f49-b96d-07d724fa8f03\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") "
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.430311 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-config-data\") pod \"91243c38-88da-4f49-b96d-07d724fa8f03\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") "
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.430636 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91243c38-88da-4f49-b96d-07d724fa8f03-logs\") pod \"91243c38-88da-4f49-b96d-07d724fa8f03\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") "
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.430667 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-combined-ca-bundle\") pod \"91243c38-88da-4f49-b96d-07d724fa8f03\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") "
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.430692 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-nova-metadata-tls-certs\") pod \"91243c38-88da-4f49-b96d-07d724fa8f03\" (UID: \"91243c38-88da-4f49-b96d-07d724fa8f03\") "
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.431023 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91243c38-88da-4f49-b96d-07d724fa8f03-logs" (OuterVolumeSpecName: "logs") pod "91243c38-88da-4f49-b96d-07d724fa8f03" (UID: "91243c38-88da-4f49-b96d-07d724fa8f03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.431424 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91243c38-88da-4f49-b96d-07d724fa8f03-logs\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.436014 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91243c38-88da-4f49-b96d-07d724fa8f03-kube-api-access-xtxrz" (OuterVolumeSpecName: "kube-api-access-xtxrz") pod "91243c38-88da-4f49-b96d-07d724fa8f03" (UID: "91243c38-88da-4f49-b96d-07d724fa8f03"). InnerVolumeSpecName "kube-api-access-xtxrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.461676 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-config-data" (OuterVolumeSpecName: "config-data") pod "91243c38-88da-4f49-b96d-07d724fa8f03" (UID: "91243c38-88da-4f49-b96d-07d724fa8f03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.487718 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91243c38-88da-4f49-b96d-07d724fa8f03" (UID: "91243c38-88da-4f49-b96d-07d724fa8f03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.534692 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.534720 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtxrz\" (UniqueName: \"kubernetes.io/projected/91243c38-88da-4f49-b96d-07d724fa8f03-kube-api-access-xtxrz\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.534729 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.546049 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.546969 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "91243c38-88da-4f49-b96d-07d724fa8f03" (UID: "91243c38-88da-4f49-b96d-07d724fa8f03"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:53:31 crc kubenswrapper[4907]: W1129 14:53:31.553295 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d959bbb_e174_4315_935c_18f5cc65008c.slice/crio-99a83bd443331c2396937628477e9dcfde0e9bd5740c0c6dce78666f08668906 WatchSource:0}: Error finding container 99a83bd443331c2396937628477e9dcfde0e9bd5740c0c6dce78666f08668906: Status 404 returned error can't find the container with id 99a83bd443331c2396937628477e9dcfde0e9bd5740c0c6dce78666f08668906
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.636685 4907 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91243c38-88da-4f49-b96d-07d724fa8f03-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.705137 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.705270 4907 generic.go:334] "Generic (PLEG): container finished" podID="91243c38-88da-4f49-b96d-07d724fa8f03" containerID="02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b" exitCode=0
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.705288 4907 generic.go:334] "Generic (PLEG): container finished" podID="91243c38-88da-4f49-b96d-07d724fa8f03" containerID="9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2" exitCode=143
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.705320 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91243c38-88da-4f49-b96d-07d724fa8f03","Type":"ContainerDied","Data":"02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b"}
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.705338 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91243c38-88da-4f49-b96d-07d724fa8f03","Type":"ContainerDied","Data":"9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2"}
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.705352 4907 scope.go:117] "RemoveContainer" containerID="02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.705781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91243c38-88da-4f49-b96d-07d724fa8f03","Type":"ContainerDied","Data":"ef277806b9d6439e32d789fbe27a41d543bc2b0280a675e009628ae492951c28"}
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.719113 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3d959bbb-e174-4315-935c-18f5cc65008c","Type":"ContainerStarted","Data":"99a83bd443331c2396937628477e9dcfde0e9bd5740c0c6dce78666f08668906"}
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.731591 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.821313 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.832109 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.835096 4907 scope.go:117] "RemoveContainer" containerID="9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.871781 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 14:53:31 crc kubenswrapper[4907]: E1129 14:53:31.872917 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91243c38-88da-4f49-b96d-07d724fa8f03" containerName="nova-metadata-metadata"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.872935 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="91243c38-88da-4f49-b96d-07d724fa8f03" containerName="nova-metadata-metadata"
Nov 29 14:53:31 crc kubenswrapper[4907]: E1129 14:53:31.873139 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91243c38-88da-4f49-b96d-07d724fa8f03" containerName="nova-metadata-log"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.873215 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="91243c38-88da-4f49-b96d-07d724fa8f03" containerName="nova-metadata-log"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.879794 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="91243c38-88da-4f49-b96d-07d724fa8f03" containerName="nova-metadata-metadata"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.880138 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="91243c38-88da-4f49-b96d-07d724fa8f03" containerName="nova-metadata-log"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.881640 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.889397 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.897940 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.898329 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.931995 4907 scope.go:117] "RemoveContainer" containerID="02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b"
Nov 29 14:53:31 crc kubenswrapper[4907]: E1129 14:53:31.947613 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b\": container with ID starting with 02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b not found: ID does not exist" containerID="02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.947655 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b"} err="failed to get container status \"02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b\": rpc error: code = NotFound desc = could not find container \"02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b\": container with ID starting with 02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b not found: ID does not exist"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.947681 4907 scope.go:117] "RemoveContainer" containerID="9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2"
Nov 29 14:53:31 crc kubenswrapper[4907]: E1129 14:53:31.952656 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2\": container with ID starting with 9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2 not found: ID does not exist" containerID="9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.952689 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2"} err="failed to get container status \"9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2\": rpc error: code = NotFound desc = could not find container \"9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2\": container with ID starting with 9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2 not found: ID does not exist"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.952714 4907 scope.go:117] "RemoveContainer" containerID="02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.954589 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b"} err="failed to get container status \"02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b\": rpc error: code = NotFound desc = could not find container \"02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b\": container with ID starting with 02ba91897c83263de912c2535255c962e52cfb03fab111e48af05507096a119b not found: ID does not exist"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.954960 4907 scope.go:117] "RemoveContainer" containerID="9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.959601 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2"} err="failed to get container status \"9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2\": rpc error: code = NotFound desc = could not find container \"9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2\": container with ID starting with 9152ccfa8b35b26bd2d0c988f5e02e4b564ab9be6e5b4870a2c36a930d073be2 not found: ID does not exist"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.976123 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.976173 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfkb6\" (UniqueName: \"kubernetes.io/projected/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-kube-api-access-sfkb6\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.976198 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-config-data\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0"
Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.976232 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:31 crc kubenswrapper[4907]: I1129 14:53:31.976344 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-logs\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.078109 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-logs\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.078220 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.078248 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfkb6\" (UniqueName: \"kubernetes.io/projected/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-kube-api-access-sfkb6\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.078267 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-config-data\") pod \"nova-metadata-0\" (UID: 
\"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.078301 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.080202 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-logs\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.083174 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.086910 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.087477 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-config-data\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.100799 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sfkb6\" (UniqueName: \"kubernetes.io/projected/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-kube-api-access-sfkb6\") pod \"nova-metadata-0\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.305375 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.429142 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.458142 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.493359 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-operator-scripts\") pod \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\" (UID: \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\") " Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.493707 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhdzl\" (UniqueName: \"kubernetes.io/projected/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-kube-api-access-fhdzl\") pod \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\" (UID: \"5a4137fc-d9f8-46ae-9740-cf388fcb54f1\") " Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.502193 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5a4137fc-d9f8-46ae-9740-cf388fcb54f1" (UID: "5a4137fc-d9f8-46ae-9740-cf388fcb54f1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.502877 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-kube-api-access-fhdzl" (OuterVolumeSpecName: "kube-api-access-fhdzl") pod "5a4137fc-d9f8-46ae-9740-cf388fcb54f1" (UID: "5a4137fc-d9f8-46ae-9740-cf388fcb54f1"). InnerVolumeSpecName "kube-api-access-fhdzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.516275 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d7d806d-53d1-4bdb-8469-0ecd96a9896f" path="/var/lib/kubelet/pods/4d7d806d-53d1-4bdb-8469-0ecd96a9896f/volumes" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.516911 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91243c38-88da-4f49-b96d-07d724fa8f03" path="/var/lib/kubelet/pods/91243c38-88da-4f49-b96d-07d724fa8f03/volumes" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.597925 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd168f22-6342-4cce-95b4-3793763c8b41-operator-scripts\") pod \"bd168f22-6342-4cce-95b4-3793763c8b41\" (UID: \"bd168f22-6342-4cce-95b4-3793763c8b41\") " Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.598042 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw9q9\" (UniqueName: \"kubernetes.io/projected/bd168f22-6342-4cce-95b4-3793763c8b41-kube-api-access-vw9q9\") pod \"bd168f22-6342-4cce-95b4-3793763c8b41\" (UID: \"bd168f22-6342-4cce-95b4-3793763c8b41\") " Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.598563 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd168f22-6342-4cce-95b4-3793763c8b41-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "bd168f22-6342-4cce-95b4-3793763c8b41" (UID: "bd168f22-6342-4cce-95b4-3793763c8b41"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.602265 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhdzl\" (UniqueName: \"kubernetes.io/projected/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-kube-api-access-fhdzl\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.602283 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4137fc-d9f8-46ae-9740-cf388fcb54f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.602293 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd168f22-6342-4cce-95b4-3793763c8b41-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.605960 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd168f22-6342-4cce-95b4-3793763c8b41-kube-api-access-vw9q9" (OuterVolumeSpecName: "kube-api-access-vw9q9") pod "bd168f22-6342-4cce-95b4-3793763c8b41" (UID: "bd168f22-6342-4cce-95b4-3793763c8b41"). InnerVolumeSpecName "kube-api-access-vw9q9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.704331 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw9q9\" (UniqueName: \"kubernetes.io/projected/bd168f22-6342-4cce-95b4-3793763c8b41-kube-api-access-vw9q9\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.737121 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66fa6e0b-4574-4377-ab95-8a8843ffde6d","Type":"ContainerStarted","Data":"8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332"} Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.737179 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66fa6e0b-4574-4377-ab95-8a8843ffde6d","Type":"ContainerStarted","Data":"27fcdf6c27bda289b87cd0e1baba1c1004ae62ebd0445fa5ea879f6d10b38a6f"} Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.752319 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerStarted","Data":"50cc74645f167661bd4039818293f5bf16182f90d9975b5bec439e13cdd0ebd2"} Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.753317 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.756253 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-56fkz" event={"ID":"5a4137fc-d9f8-46ae-9740-cf388fcb54f1","Type":"ContainerDied","Data":"17a52127f36b52882a09332974de109ce78f2452963bb09d10ef061c88244300"} Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.756277 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17a52127f36b52882a09332974de109ce78f2452963bb09d10ef061c88244300" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.756346 4907 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-56fkz" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.761656 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0272-account-create-update-zc6hq" event={"ID":"bd168f22-6342-4cce-95b4-3793763c8b41","Type":"ContainerDied","Data":"839807cf8ea95234056f2a3a55dd1ae75d35b408946ba484ab6fef33000a7931"} Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.761681 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="839807cf8ea95234056f2a3a55dd1ae75d35b408946ba484ab6fef33000a7931" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.761756 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0272-account-create-update-zc6hq" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.780896 4907 generic.go:334] "Generic (PLEG): container finished" podID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerID="2a647c7c8a5e16eac0b1901556f10923e979e36c40e51ec813c7ec7400d78489" exitCode=0 Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.780971 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9wpf" event={"ID":"9bd02179-a472-4e40-bd4c-4c6cd9a5155d","Type":"ContainerDied","Data":"2a647c7c8a5e16eac0b1901556f10923e979e36c40e51ec813c7ec7400d78489"} Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.782890 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3d959bbb-e174-4315-935c-18f5cc65008c","Type":"ContainerStarted","Data":"eddb55a0151dd48213269fe9b1e7873b031c6743de2a63cbbc25e3632f35dc59"} Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.784149 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.798671 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.798632685 podStartE2EDuration="2.798632685s" podCreationTimestamp="2025-11-29 14:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:32.755585612 +0000 UTC m=+1510.742423264" watchObservedRunningTime="2025-11-29 14:53:32.798632685 +0000 UTC m=+1510.785470337" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.813834 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.508782793 podStartE2EDuration="7.813815163s" podCreationTimestamp="2025-11-29 14:53:25 +0000 UTC" firstStartedPulling="2025-11-29 14:53:26.39374499 +0000 UTC m=+1504.380582642" lastFinishedPulling="2025-11-29 14:53:30.69877736 +0000 UTC m=+1508.685615012" observedRunningTime="2025-11-29 14:53:32.776641835 +0000 UTC m=+1510.763479487" watchObservedRunningTime="2025-11-29 14:53:32.813815163 +0000 UTC m=+1510.800652815" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.828210 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.828190699 podStartE2EDuration="2.828190699s" podCreationTimestamp="2025-11-29 14:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:32.813522735 +0000 UTC m=+1510.800360387" watchObservedRunningTime="2025-11-29 14:53:32.828190699 +0000 UTC m=+1510.815028351" Nov 29 14:53:32 crc kubenswrapper[4907]: I1129 14:53:32.911481 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:53:33 crc kubenswrapper[4907]: I1129 14:53:33.793870 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"08c24285-2f83-4e3b-8c8d-acdb01ed50ee","Type":"ContainerStarted","Data":"79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5"} Nov 29 14:53:33 crc kubenswrapper[4907]: I1129 14:53:33.795211 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"08c24285-2f83-4e3b-8c8d-acdb01ed50ee","Type":"ContainerStarted","Data":"ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152"} Nov 29 14:53:33 crc kubenswrapper[4907]: I1129 14:53:33.795313 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"08c24285-2f83-4e3b-8c8d-acdb01ed50ee","Type":"ContainerStarted","Data":"563ba47f2ebd4876340b2883c24e0187454b525432811bff02781c397ab3be34"} Nov 29 14:53:33 crc kubenswrapper[4907]: I1129 14:53:33.798988 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9wpf" event={"ID":"9bd02179-a472-4e40-bd4c-4c6cd9a5155d","Type":"ContainerStarted","Data":"14e2593ab26c264f0fcdd1fe9cfff97f796cdb9b83f7fd9d90090a292ad0df74"} Nov 29 14:53:33 crc kubenswrapper[4907]: I1129 14:53:33.828870 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.828829311 podStartE2EDuration="2.828829311s" podCreationTimestamp="2025-11-29 14:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:33.828644006 +0000 UTC m=+1511.815481658" watchObservedRunningTime="2025-11-29 14:53:33.828829311 +0000 UTC m=+1511.815666963" Nov 29 14:53:33 crc kubenswrapper[4907]: I1129 14:53:33.863201 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s9wpf" podStartSLOduration=3.224672062 podStartE2EDuration="6.863182329s" podCreationTimestamp="2025-11-29 14:53:27 +0000 UTC" firstStartedPulling="2025-11-29 14:53:29.563256534 
+0000 UTC m=+1507.550094186" lastFinishedPulling="2025-11-29 14:53:33.201766791 +0000 UTC m=+1511.188604453" observedRunningTime="2025-11-29 14:53:33.855809922 +0000 UTC m=+1511.842647564" watchObservedRunningTime="2025-11-29 14:53:33.863182329 +0000 UTC m=+1511.850019981" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.613593 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.783287 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcrgk\" (UniqueName: \"kubernetes.io/projected/2a663d98-2309-4412-a832-12e75ad4098a-kube-api-access-xcrgk\") pod \"2a663d98-2309-4412-a832-12e75ad4098a\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.783330 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-config-data\") pod \"2a663d98-2309-4412-a832-12e75ad4098a\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.783498 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a663d98-2309-4412-a832-12e75ad4098a-logs\") pod \"2a663d98-2309-4412-a832-12e75ad4098a\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.783576 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-combined-ca-bundle\") pod \"2a663d98-2309-4412-a832-12e75ad4098a\" (UID: \"2a663d98-2309-4412-a832-12e75ad4098a\") " Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.784005 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2a663d98-2309-4412-a832-12e75ad4098a-logs" (OuterVolumeSpecName: "logs") pod "2a663d98-2309-4412-a832-12e75ad4098a" (UID: "2a663d98-2309-4412-a832-12e75ad4098a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.784497 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a663d98-2309-4412-a832-12e75ad4098a-logs\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.790587 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a663d98-2309-4412-a832-12e75ad4098a-kube-api-access-xcrgk" (OuterVolumeSpecName: "kube-api-access-xcrgk") pod "2a663d98-2309-4412-a832-12e75ad4098a" (UID: "2a663d98-2309-4412-a832-12e75ad4098a"). InnerVolumeSpecName "kube-api-access-xcrgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.817594 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a663d98-2309-4412-a832-12e75ad4098a" (UID: "2a663d98-2309-4412-a832-12e75ad4098a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.820238 4907 generic.go:334] "Generic (PLEG): container finished" podID="2a663d98-2309-4412-a832-12e75ad4098a" containerID="29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf" exitCode=0 Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.820634 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a663d98-2309-4412-a832-12e75ad4098a","Type":"ContainerDied","Data":"29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf"} Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.820683 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a663d98-2309-4412-a832-12e75ad4098a","Type":"ContainerDied","Data":"6ad25f5eb25a071841decfd44028b4321de66b466706877fac102ddff8679c41"} Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.820730 4907 scope.go:117] "RemoveContainer" containerID="29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.820957 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.857642 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-config-data" (OuterVolumeSpecName: "config-data") pod "2a663d98-2309-4412-a832-12e75ad4098a" (UID: "2a663d98-2309-4412-a832-12e75ad4098a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.863784 4907 scope.go:117] "RemoveContainer" containerID="68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.890336 4907 scope.go:117] "RemoveContainer" containerID="29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.890581 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcrgk\" (UniqueName: \"kubernetes.io/projected/2a663d98-2309-4412-a832-12e75ad4098a-kube-api-access-xcrgk\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.890604 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.890615 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a663d98-2309-4412-a832-12e75ad4098a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:34 crc kubenswrapper[4907]: E1129 14:53:34.891825 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf\": container with ID starting with 29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf not found: ID does not exist" containerID="29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.891868 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf"} err="failed to get container status 
\"29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf\": rpc error: code = NotFound desc = could not find container \"29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf\": container with ID starting with 29c0e97bf81479abf03b5f43b19de93c1d8a01ebb694e3925b5fb598faa3f8cf not found: ID does not exist" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.891898 4907 scope.go:117] "RemoveContainer" containerID="68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b" Nov 29 14:53:34 crc kubenswrapper[4907]: E1129 14:53:34.892242 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b\": container with ID starting with 68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b not found: ID does not exist" containerID="68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b" Nov 29 14:53:34 crc kubenswrapper[4907]: I1129 14:53:34.892285 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b"} err="failed to get container status \"68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b\": rpc error: code = NotFound desc = could not find container \"68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b\": container with ID starting with 68c2229aca716d602638c9a1c299e8b6da3fdbd6db6e60897ed73d5ac405899b not found: ID does not exist" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.160345 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.194815 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.208498 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] 
Nov 29 14:53:35 crc kubenswrapper[4907]: E1129 14:53:35.209051 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4137fc-d9f8-46ae-9740-cf388fcb54f1" containerName="mariadb-database-create" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.209071 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4137fc-d9f8-46ae-9740-cf388fcb54f1" containerName="mariadb-database-create" Nov 29 14:53:35 crc kubenswrapper[4907]: E1129 14:53:35.209113 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-log" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.209121 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-log" Nov 29 14:53:35 crc kubenswrapper[4907]: E1129 14:53:35.209142 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd168f22-6342-4cce-95b4-3793763c8b41" containerName="mariadb-account-create-update" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.209149 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd168f22-6342-4cce-95b4-3793763c8b41" containerName="mariadb-account-create-update" Nov 29 14:53:35 crc kubenswrapper[4907]: E1129 14:53:35.209174 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-api" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.209179 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-api" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.209420 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-api" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.209450 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd168f22-6342-4cce-95b4-3793763c8b41" 
containerName="mariadb-account-create-update" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.209475 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a663d98-2309-4412-a832-12e75ad4098a" containerName="nova-api-log" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.209486 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4137fc-d9f8-46ae-9740-cf388fcb54f1" containerName="mariadb-database-create" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.211012 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.214001 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.240330 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.403353 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-config-data\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.403501 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjzqp\" (UniqueName: \"kubernetes.io/projected/d4eabed3-66b5-45e3-a2d6-a174b4064289-kube-api-access-kjzqp\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.403672 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.403728 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4eabed3-66b5-45e3-a2d6-a174b4064289-logs\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.505797 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-config-data\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.505852 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjzqp\" (UniqueName: \"kubernetes.io/projected/d4eabed3-66b5-45e3-a2d6-a174b4064289-kube-api-access-kjzqp\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.506165 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.506188 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4eabed3-66b5-45e3-a2d6-a174b4064289-logs\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.506651 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d4eabed3-66b5-45e3-a2d6-a174b4064289-logs\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.513174 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-config-data\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.513662 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.546985 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjzqp\" (UniqueName: \"kubernetes.io/projected/d4eabed3-66b5-45e3-a2d6-a174b4064289-kube-api-access-kjzqp\") pod \"nova-api-0\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " pod="openstack/nova-api-0" Nov 29 14:53:35 crc kubenswrapper[4907]: I1129 14:53:35.842982 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.060409 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.407786 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.492403 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a663d98-2309-4412-a832-12e75ad4098a" path="/var/lib/kubelet/pods/2a663d98-2309-4412-a832-12e75ad4098a/volumes" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.538475 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pbssn"] Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.557351 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbssn"] Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.557479 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.753179 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6b5c\" (UniqueName: \"kubernetes.io/projected/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-kube-api-access-k6b5c\") pod \"redhat-marketplace-pbssn\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.753418 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-utilities\") pod \"redhat-marketplace-pbssn\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.753636 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-catalog-content\") pod \"redhat-marketplace-pbssn\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.848181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4eabed3-66b5-45e3-a2d6-a174b4064289","Type":"ContainerStarted","Data":"64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b"} Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.849614 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4eabed3-66b5-45e3-a2d6-a174b4064289","Type":"ContainerStarted","Data":"b9053da85f02158e838c16254c68717d2925b3e504c53d53048532c49810988a"} Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.856065 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6b5c\" (UniqueName: \"kubernetes.io/projected/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-kube-api-access-k6b5c\") pod \"redhat-marketplace-pbssn\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.856320 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-utilities\") pod \"redhat-marketplace-pbssn\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.856680 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-catalog-content\") pod \"redhat-marketplace-pbssn\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.857331 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-catalog-content\") pod \"redhat-marketplace-pbssn\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.857345 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-utilities\") pod \"redhat-marketplace-pbssn\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.888327 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-k6b5c\" (UniqueName: \"kubernetes.io/projected/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-kube-api-access-k6b5c\") pod \"redhat-marketplace-pbssn\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:36 crc kubenswrapper[4907]: I1129 14:53:36.904144 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.306759 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.307153 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.414310 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbssn"] Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.665090 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.665348 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.864661 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4eabed3-66b5-45e3-a2d6-a174b4064289","Type":"ContainerStarted","Data":"b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500"} Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.875772 4907 generic.go:334] "Generic (PLEG): container finished" podID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerID="28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f" exitCode=0 Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.875818 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbssn" event={"ID":"2bf970f2-06d2-43be-bee5-4bbcd0a147d5","Type":"ContainerDied","Data":"28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f"} Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.875844 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbssn" event={"ID":"2bf970f2-06d2-43be-bee5-4bbcd0a147d5","Type":"ContainerStarted","Data":"adec4d801ab97c44a9cfd01063ee39638b57959b3ec6ee0d63c434687bbb670f"} Nov 29 14:53:37 crc kubenswrapper[4907]: I1129 14:53:37.896816 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.896791816 podStartE2EDuration="2.896791816s" podCreationTimestamp="2025-11-29 14:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:37.88628639 +0000 UTC m=+1515.873124042" watchObservedRunningTime="2025-11-29 14:53:37.896791816 +0000 UTC m=+1515.883629468" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.591398 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-47w2x"] Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.601236 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.603660 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.607139 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-47w2x"] Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.607240 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.607628 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.608384 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-b88t7" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.609335 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-config-data\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.609421 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-scripts\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.609677 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-combined-ca-bundle\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " 
pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.609705 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c62s4\" (UniqueName: \"kubernetes.io/projected/22a1f021-2c7f-497d-9f2e-f64cefe8822d-kube-api-access-c62s4\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.711562 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-combined-ca-bundle\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.711839 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c62s4\" (UniqueName: \"kubernetes.io/projected/22a1f021-2c7f-497d-9f2e-f64cefe8822d-kube-api-access-c62s4\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.711885 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-config-data\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.711930 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-scripts\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.718226 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-scripts\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.719891 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-config-data\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.723676 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-combined-ca-bundle\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.743557 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-s9wpf" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerName="registry-server" probeResult="failure" output=< Nov 29 14:53:38 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 14:53:38 crc kubenswrapper[4907]: > Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.753055 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c62s4\" (UniqueName: \"kubernetes.io/projected/22a1f021-2c7f-497d-9f2e-f64cefe8822d-kube-api-access-c62s4\") pod \"aodh-db-sync-47w2x\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:38 crc kubenswrapper[4907]: I1129 14:53:38.929700 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:39 crc kubenswrapper[4907]: I1129 14:53:39.473923 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-47w2x"] Nov 29 14:53:39 crc kubenswrapper[4907]: W1129 14:53:39.476756 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22a1f021_2c7f_497d_9f2e_f64cefe8822d.slice/crio-b5f20e2b0569f2339658a77e5f0f80c8f5c6d3edecb5e34536b6bc15a63fec3c WatchSource:0}: Error finding container b5f20e2b0569f2339658a77e5f0f80c8f5c6d3edecb5e34536b6bc15a63fec3c: Status 404 returned error can't find the container with id b5f20e2b0569f2339658a77e5f0f80c8f5c6d3edecb5e34536b6bc15a63fec3c Nov 29 14:53:39 crc kubenswrapper[4907]: I1129 14:53:39.903378 4907 generic.go:334] "Generic (PLEG): container finished" podID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerID="93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e" exitCode=0 Nov 29 14:53:39 crc kubenswrapper[4907]: I1129 14:53:39.903435 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbssn" event={"ID":"2bf970f2-06d2-43be-bee5-4bbcd0a147d5","Type":"ContainerDied","Data":"93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e"} Nov 29 14:53:39 crc kubenswrapper[4907]: I1129 14:53:39.916363 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47w2x" event={"ID":"22a1f021-2c7f-497d-9f2e-f64cefe8822d","Type":"ContainerStarted","Data":"b5f20e2b0569f2339658a77e5f0f80c8f5c6d3edecb5e34536b6bc15a63fec3c"} Nov 29 14:53:40 crc kubenswrapper[4907]: I1129 14:53:40.929778 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 29 14:53:40 crc kubenswrapper[4907]: I1129 14:53:40.932799 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbssn" 
event={"ID":"2bf970f2-06d2-43be-bee5-4bbcd0a147d5","Type":"ContainerStarted","Data":"71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767"} Nov 29 14:53:41 crc kubenswrapper[4907]: I1129 14:53:41.060561 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 14:53:41 crc kubenswrapper[4907]: I1129 14:53:41.098024 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 14:53:41 crc kubenswrapper[4907]: I1129 14:53:41.123275 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pbssn" podStartSLOduration=2.647129863 podStartE2EDuration="5.123246896s" podCreationTimestamp="2025-11-29 14:53:36 +0000 UTC" firstStartedPulling="2025-11-29 14:53:37.877800531 +0000 UTC m=+1515.864638183" lastFinishedPulling="2025-11-29 14:53:40.353917564 +0000 UTC m=+1518.340755216" observedRunningTime="2025-11-29 14:53:40.982805636 +0000 UTC m=+1518.969643288" watchObservedRunningTime="2025-11-29 14:53:41.123246896 +0000 UTC m=+1519.110084548" Nov 29 14:53:41 crc kubenswrapper[4907]: I1129 14:53:41.985327 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 14:53:42 crc kubenswrapper[4907]: I1129 14:53:42.306256 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 14:53:42 crc kubenswrapper[4907]: I1129 14:53:42.306590 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 14:53:43 crc kubenswrapper[4907]: I1129 14:53:43.321581 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.243:8775/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Nov 29 14:53:43 crc kubenswrapper[4907]: I1129 14:53:43.321657 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.243:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 14:53:45 crc kubenswrapper[4907]: I1129 14:53:45.844779 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 14:53:45 crc kubenswrapper[4907]: I1129 14:53:45.845260 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 14:53:45 crc kubenswrapper[4907]: I1129 14:53:45.994296 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47w2x" event={"ID":"22a1f021-2c7f-497d-9f2e-f64cefe8822d","Type":"ContainerStarted","Data":"13f30d1a8b7841519241bf7e5ac0d1e54fcb9641ec7f4ffb445550a209028be1"} Nov 29 14:53:46 crc kubenswrapper[4907]: I1129 14:53:46.012112 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-47w2x" podStartSLOduration=2.099700538 podStartE2EDuration="8.012095297s" podCreationTimestamp="2025-11-29 14:53:38 +0000 UTC" firstStartedPulling="2025-11-29 14:53:39.478821542 +0000 UTC m=+1517.465659194" lastFinishedPulling="2025-11-29 14:53:45.391216281 +0000 UTC m=+1523.378053953" observedRunningTime="2025-11-29 14:53:46.008587838 +0000 UTC m=+1523.995425490" watchObservedRunningTime="2025-11-29 14:53:46.012095297 +0000 UTC m=+1523.998932949" Nov 29 14:53:46 crc kubenswrapper[4907]: I1129 14:53:46.905187 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:46 crc kubenswrapper[4907]: I1129 14:53:46.905243 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:46 crc kubenswrapper[4907]: I1129 14:53:46.927699 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.244:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 14:53:46 crc kubenswrapper[4907]: I1129 14:53:46.928057 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.244:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 14:53:46 crc kubenswrapper[4907]: I1129 14:53:46.992350 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:47 crc kubenswrapper[4907]: I1129 14:53:47.061803 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:47 crc kubenswrapper[4907]: I1129 14:53:47.233063 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbssn"] Nov 29 14:53:47 crc kubenswrapper[4907]: I1129 14:53:47.730310 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:47 crc kubenswrapper[4907]: I1129 14:53:47.793017 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:48 crc kubenswrapper[4907]: I1129 14:53:48.013960 4907 generic.go:334] "Generic (PLEG): container finished" podID="22a1f021-2c7f-497d-9f2e-f64cefe8822d" containerID="13f30d1a8b7841519241bf7e5ac0d1e54fcb9641ec7f4ffb445550a209028be1" exitCode=0 Nov 29 14:53:48 crc 
kubenswrapper[4907]: I1129 14:53:48.014051 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47w2x" event={"ID":"22a1f021-2c7f-497d-9f2e-f64cefe8822d","Type":"ContainerDied","Data":"13f30d1a8b7841519241bf7e5ac0d1e54fcb9641ec7f4ffb445550a209028be1"} Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.023597 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pbssn" podUID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerName="registry-server" containerID="cri-o://71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767" gracePeriod=2 Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.582605 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.591554 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.607977 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-config-data\") pod \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.608092 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6b5c\" (UniqueName: \"kubernetes.io/projected/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-kube-api-access-k6b5c\") pod \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.608185 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-utilities\") pod 
\"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.608218 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c62s4\" (UniqueName: \"kubernetes.io/projected/22a1f021-2c7f-497d-9f2e-f64cefe8822d-kube-api-access-c62s4\") pod \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.608273 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-combined-ca-bundle\") pod \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.608364 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-catalog-content\") pod \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\" (UID: \"2bf970f2-06d2-43be-bee5-4bbcd0a147d5\") " Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.608452 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-scripts\") pod \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\" (UID: \"22a1f021-2c7f-497d-9f2e-f64cefe8822d\") " Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.609153 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-utilities" (OuterVolumeSpecName: "utilities") pod "2bf970f2-06d2-43be-bee5-4bbcd0a147d5" (UID: "2bf970f2-06d2-43be-bee5-4bbcd0a147d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.609704 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.613836 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-kube-api-access-k6b5c" (OuterVolumeSpecName: "kube-api-access-k6b5c") pod "2bf970f2-06d2-43be-bee5-4bbcd0a147d5" (UID: "2bf970f2-06d2-43be-bee5-4bbcd0a147d5"). InnerVolumeSpecName "kube-api-access-k6b5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.616686 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-scripts" (OuterVolumeSpecName: "scripts") pod "22a1f021-2c7f-497d-9f2e-f64cefe8822d" (UID: "22a1f021-2c7f-497d-9f2e-f64cefe8822d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.621474 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a1f021-2c7f-497d-9f2e-f64cefe8822d-kube-api-access-c62s4" (OuterVolumeSpecName: "kube-api-access-c62s4") pod "22a1f021-2c7f-497d-9f2e-f64cefe8822d" (UID: "22a1f021-2c7f-497d-9f2e-f64cefe8822d"). InnerVolumeSpecName "kube-api-access-c62s4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.636665 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bf970f2-06d2-43be-bee5-4bbcd0a147d5" (UID: "2bf970f2-06d2-43be-bee5-4bbcd0a147d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.650739 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-config-data" (OuterVolumeSpecName: "config-data") pod "22a1f021-2c7f-497d-9f2e-f64cefe8822d" (UID: "22a1f021-2c7f-497d-9f2e-f64cefe8822d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.676572 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9wpf"] Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.676820 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22a1f021-2c7f-497d-9f2e-f64cefe8822d" (UID: "22a1f021-2c7f-497d-9f2e-f64cefe8822d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.676967 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s9wpf" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerName="registry-server" containerID="cri-o://14e2593ab26c264f0fcdd1fe9cfff97f796cdb9b83f7fd9d90090a292ad0df74" gracePeriod=2 Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.713635 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.713677 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.713690 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.713703 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6b5c\" (UniqueName: \"kubernetes.io/projected/2bf970f2-06d2-43be-bee5-4bbcd0a147d5-kube-api-access-k6b5c\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.713717 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c62s4\" (UniqueName: \"kubernetes.io/projected/22a1f021-2c7f-497d-9f2e-f64cefe8822d-kube-api-access-c62s4\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:49 crc kubenswrapper[4907]: I1129 14:53:49.713729 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/22a1f021-2c7f-497d-9f2e-f64cefe8822d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.045128 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47w2x" event={"ID":"22a1f021-2c7f-497d-9f2e-f64cefe8822d","Type":"ContainerDied","Data":"b5f20e2b0569f2339658a77e5f0f80c8f5c6d3edecb5e34536b6bc15a63fec3c"} Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.045531 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5f20e2b0569f2339658a77e5f0f80c8f5c6d3edecb5e34536b6bc15a63fec3c" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.045607 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-47w2x" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.055288 4907 generic.go:334] "Generic (PLEG): container finished" podID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerID="71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767" exitCode=0 Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.055376 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbssn" event={"ID":"2bf970f2-06d2-43be-bee5-4bbcd0a147d5","Type":"ContainerDied","Data":"71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767"} Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.055420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbssn" event={"ID":"2bf970f2-06d2-43be-bee5-4bbcd0a147d5","Type":"ContainerDied","Data":"adec4d801ab97c44a9cfd01063ee39638b57959b3ec6ee0d63c434687bbb670f"} Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.055467 4907 scope.go:117] "RemoveContainer" containerID="71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.055638 4907 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbssn" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.062994 4907 generic.go:334] "Generic (PLEG): container finished" podID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerID="14e2593ab26c264f0fcdd1fe9cfff97f796cdb9b83f7fd9d90090a292ad0df74" exitCode=0 Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.063054 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9wpf" event={"ID":"9bd02179-a472-4e40-bd4c-4c6cd9a5155d","Type":"ContainerDied","Data":"14e2593ab26c264f0fcdd1fe9cfff97f796cdb9b83f7fd9d90090a292ad0df74"} Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.110214 4907 scope.go:117] "RemoveContainer" containerID="93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.119235 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbssn"] Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.143945 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbssn"] Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.145966 4907 scope.go:117] "RemoveContainer" containerID="28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.275916 4907 scope.go:117] "RemoveContainer" containerID="71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767" Nov 29 14:53:50 crc kubenswrapper[4907]: E1129 14:53:50.281414 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767\": container with ID starting with 71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767 not found: ID does not exist" 
containerID="71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.281729 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767"} err="failed to get container status \"71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767\": rpc error: code = NotFound desc = could not find container \"71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767\": container with ID starting with 71e11f905446e057e462c4b00c12145e5d264802f906bdf49264c5cad7cbd767 not found: ID does not exist" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.281763 4907 scope.go:117] "RemoveContainer" containerID="93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e" Nov 29 14:53:50 crc kubenswrapper[4907]: E1129 14:53:50.283802 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e\": container with ID starting with 93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e not found: ID does not exist" containerID="93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.283833 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e"} err="failed to get container status \"93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e\": rpc error: code = NotFound desc = could not find container \"93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e\": container with ID starting with 93c9bba0c0d51ef719ca1bffed281a10e629395f8b1280b520a4230d508e0f6e not found: ID does not exist" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.283854 4907 scope.go:117] 
"RemoveContainer" containerID="28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f" Nov 29 14:53:50 crc kubenswrapper[4907]: E1129 14:53:50.284480 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f\": container with ID starting with 28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f not found: ID does not exist" containerID="28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.284517 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f"} err="failed to get container status \"28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f\": rpc error: code = NotFound desc = could not find container \"28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f\": container with ID starting with 28e4c8a11b831f014217af0e67c5ccf2a84059a63842656f4cea02d4281e098f not found: ID does not exist" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.299608 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.330554 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl885\" (UniqueName: \"kubernetes.io/projected/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-kube-api-access-kl885\") pod \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.330598 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-catalog-content\") pod \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.330742 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-utilities\") pod \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\" (UID: \"9bd02179-a472-4e40-bd4c-4c6cd9a5155d\") " Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.331210 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-utilities" (OuterVolumeSpecName: "utilities") pod "9bd02179-a472-4e40-bd4c-4c6cd9a5155d" (UID: "9bd02179-a472-4e40-bd4c-4c6cd9a5155d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.335449 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-kube-api-access-kl885" (OuterVolumeSpecName: "kube-api-access-kl885") pod "9bd02179-a472-4e40-bd4c-4c6cd9a5155d" (UID: "9bd02179-a472-4e40-bd4c-4c6cd9a5155d"). InnerVolumeSpecName "kube-api-access-kl885". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.381396 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bd02179-a472-4e40-bd4c-4c6cd9a5155d" (UID: "9bd02179-a472-4e40-bd4c-4c6cd9a5155d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.433081 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl885\" (UniqueName: \"kubernetes.io/projected/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-kube-api-access-kl885\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.433119 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.433135 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bd02179-a472-4e40-bd4c-4c6cd9a5155d-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:50 crc kubenswrapper[4907]: I1129 14:53:50.498351 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" path="/var/lib/kubelet/pods/2bf970f2-06d2-43be-bee5-4bbcd0a147d5/volumes" Nov 29 14:53:51 crc kubenswrapper[4907]: I1129 14:53:51.085941 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9wpf" event={"ID":"9bd02179-a472-4e40-bd4c-4c6cd9a5155d","Type":"ContainerDied","Data":"161a89a75134649622dead331100b40eb963cba4c902ee5731099e61f129cde4"} Nov 29 14:53:51 crc kubenswrapper[4907]: I1129 14:53:51.085976 4907 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9wpf" Nov 29 14:53:51 crc kubenswrapper[4907]: I1129 14:53:51.086014 4907 scope.go:117] "RemoveContainer" containerID="14e2593ab26c264f0fcdd1fe9cfff97f796cdb9b83f7fd9d90090a292ad0df74" Nov 29 14:53:51 crc kubenswrapper[4907]: I1129 14:53:51.125471 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9wpf"] Nov 29 14:53:51 crc kubenswrapper[4907]: I1129 14:53:51.143949 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s9wpf"] Nov 29 14:53:51 crc kubenswrapper[4907]: I1129 14:53:51.146639 4907 scope.go:117] "RemoveContainer" containerID="2a647c7c8a5e16eac0b1901556f10923e979e36c40e51ec813c7ec7400d78489" Nov 29 14:53:51 crc kubenswrapper[4907]: I1129 14:53:51.174763 4907 scope.go:117] "RemoveContainer" containerID="24fccc4d6722769af734c895de91a977e92c669646e950ddbe6b19335ea8a27f" Nov 29 14:53:52 crc kubenswrapper[4907]: I1129 14:53:52.326012 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 14:53:52 crc kubenswrapper[4907]: I1129 14:53:52.328097 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 14:53:52 crc kubenswrapper[4907]: I1129 14:53:52.334859 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 14:53:52 crc kubenswrapper[4907]: I1129 14:53:52.495359 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" path="/var/lib/kubelet/pods/9bd02179-a472-4e40-bd4c-4c6cd9a5155d/volumes" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.122243 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.674913 4907 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/aodh-0"] Nov 29 14:53:53 crc kubenswrapper[4907]: E1129 14:53:53.675683 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerName="registry-server" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.675696 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerName="registry-server" Nov 29 14:53:53 crc kubenswrapper[4907]: E1129 14:53:53.675720 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerName="extract-utilities" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.675727 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerName="extract-utilities" Nov 29 14:53:53 crc kubenswrapper[4907]: E1129 14:53:53.675742 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerName="extract-content" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.675748 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerName="extract-content" Nov 29 14:53:53 crc kubenswrapper[4907]: E1129 14:53:53.675778 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a1f021-2c7f-497d-9f2e-f64cefe8822d" containerName="aodh-db-sync" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.675784 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a1f021-2c7f-497d-9f2e-f64cefe8822d" containerName="aodh-db-sync" Nov 29 14:53:53 crc kubenswrapper[4907]: E1129 14:53:53.675798 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerName="extract-content" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.675806 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" 
containerName="extract-content" Nov 29 14:53:53 crc kubenswrapper[4907]: E1129 14:53:53.675819 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerName="registry-server" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.675825 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerName="registry-server" Nov 29 14:53:53 crc kubenswrapper[4907]: E1129 14:53:53.675841 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerName="extract-utilities" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.675846 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerName="extract-utilities" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.676101 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf970f2-06d2-43be-bee5-4bbcd0a147d5" containerName="registry-server" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.676114 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a1f021-2c7f-497d-9f2e-f64cefe8822d" containerName="aodh-db-sync" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.676124 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd02179-a472-4e40-bd4c-4c6cd9a5155d" containerName="registry-server" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.678109 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.679866 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-b88t7" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.680041 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.680809 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.708648 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.723828 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-config-data\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.724108 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.724143 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqzw\" (UniqueName: \"kubernetes.io/projected/d163ea39-da39-4c8c-8a6a-552e55751e61-kube-api-access-xxqzw\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.724201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-scripts\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.779515 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.824994 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zgfr\" (UniqueName: \"kubernetes.io/projected/38bf2698-d87c-4bea-87ae-35b657433ba5-kube-api-access-4zgfr\") pod \"38bf2698-d87c-4bea-87ae-35b657433ba5\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.825060 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-config-data\") pod \"38bf2698-d87c-4bea-87ae-35b657433ba5\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.825189 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-combined-ca-bundle\") pod \"38bf2698-d87c-4bea-87ae-35b657433ba5\" (UID: \"38bf2698-d87c-4bea-87ae-35b657433ba5\") " Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.825370 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-config-data\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.825781 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.825819 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqzw\" (UniqueName: \"kubernetes.io/projected/d163ea39-da39-4c8c-8a6a-552e55751e61-kube-api-access-xxqzw\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.826240 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-scripts\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.831521 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.832932 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-config-data\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.838935 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-scripts\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.846115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xxqzw\" (UniqueName: \"kubernetes.io/projected/d163ea39-da39-4c8c-8a6a-552e55751e61-kube-api-access-xxqzw\") pod \"aodh-0\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " pod="openstack/aodh-0" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.846501 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bf2698-d87c-4bea-87ae-35b657433ba5-kube-api-access-4zgfr" (OuterVolumeSpecName: "kube-api-access-4zgfr") pod "38bf2698-d87c-4bea-87ae-35b657433ba5" (UID: "38bf2698-d87c-4bea-87ae-35b657433ba5"). InnerVolumeSpecName "kube-api-access-4zgfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.861243 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-config-data" (OuterVolumeSpecName: "config-data") pod "38bf2698-d87c-4bea-87ae-35b657433ba5" (UID: "38bf2698-d87c-4bea-87ae-35b657433ba5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.867699 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38bf2698-d87c-4bea-87ae-35b657433ba5" (UID: "38bf2698-d87c-4bea-87ae-35b657433ba5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.930221 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.930253 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zgfr\" (UniqueName: \"kubernetes.io/projected/38bf2698-d87c-4bea-87ae-35b657433ba5-kube-api-access-4zgfr\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:53 crc kubenswrapper[4907]: I1129 14:53:53.930266 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bf2698-d87c-4bea-87ae-35b657433ba5-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.096407 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.127171 4907 generic.go:334] "Generic (PLEG): container finished" podID="38bf2698-d87c-4bea-87ae-35b657433ba5" containerID="c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7" exitCode=137 Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.128205 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.128504 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38bf2698-d87c-4bea-87ae-35b657433ba5","Type":"ContainerDied","Data":"c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7"} Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.128558 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38bf2698-d87c-4bea-87ae-35b657433ba5","Type":"ContainerDied","Data":"fb3f552dca6ac1e9ba3df86381ee11157aaa74c2183236b56afe8b205d8446a9"} Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.128577 4907 scope.go:117] "RemoveContainer" containerID="c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.166474 4907 scope.go:117] "RemoveContainer" containerID="c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7" Nov 29 14:53:54 crc kubenswrapper[4907]: E1129 14:53:54.167019 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7\": container with ID starting with c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7 not found: ID does not exist" containerID="c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.167051 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7"} err="failed to get container status \"c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7\": rpc error: code = NotFound desc = could not find container \"c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7\": container with ID starting with 
c1cfd3abb4296e9c7f9ed10fc56d9bb2c333bf529f36cd461de9993354f3efb7 not found: ID does not exist" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.169628 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.185664 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.201144 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 14:53:54 crc kubenswrapper[4907]: E1129 14:53:54.201669 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bf2698-d87c-4bea-87ae-35b657433ba5" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.201686 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bf2698-d87c-4bea-87ae-35b657433ba5" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.201941 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bf2698-d87c-4bea-87ae-35b657433ba5" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.202795 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.227198 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.227611 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.227768 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.249760 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phx9n\" (UniqueName: \"kubernetes.io/projected/9a20c0df-80ec-4e00-9bec-409327ec2c90-kube-api-access-phx9n\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.253819 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.253920 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.254067 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.254151 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.299650 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.361529 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.361634 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.361786 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.361871 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.361993 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phx9n\" (UniqueName: \"kubernetes.io/projected/9a20c0df-80ec-4e00-9bec-409327ec2c90-kube-api-access-phx9n\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.370563 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.371056 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.384790 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.385539 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a20c0df-80ec-4e00-9bec-409327ec2c90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.393966 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phx9n\" (UniqueName: \"kubernetes.io/projected/9a20c0df-80ec-4e00-9bec-409327ec2c90-kube-api-access-phx9n\") pod \"nova-cell1-novncproxy-0\" (UID: \"9a20c0df-80ec-4e00-9bec-409327ec2c90\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.496565 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bf2698-d87c-4bea-87ae-35b657433ba5" path="/var/lib/kubelet/pods/38bf2698-d87c-4bea-87ae-35b657433ba5/volumes" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.657752 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:54 crc kubenswrapper[4907]: I1129 14:53:54.754154 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 29 14:53:54 crc kubenswrapper[4907]: W1129 14:53:54.756137 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd163ea39_da39_4c8c_8a6a_552e55751e61.slice/crio-17c8f1c1396fe71a6a038e743b999c7422b332781b7f2ce265837e1c7ec34b95 WatchSource:0}: Error finding container 17c8f1c1396fe71a6a038e743b999c7422b332781b7f2ce265837e1c7ec34b95: Status 404 returned error can't find the container with id 17c8f1c1396fe71a6a038e743b999c7422b332781b7f2ce265837e1c7ec34b95 Nov 29 14:53:55 crc kubenswrapper[4907]: I1129 14:53:55.139420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerStarted","Data":"17c8f1c1396fe71a6a038e743b999c7422b332781b7f2ce265837e1c7ec34b95"} Nov 29 14:53:55 crc kubenswrapper[4907]: I1129 14:53:55.184427 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 14:53:55 crc kubenswrapper[4907]: I1129 14:53:55.747183 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 29 14:53:55 crc kubenswrapper[4907]: I1129 14:53:55.870000 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 14:53:55 crc kubenswrapper[4907]: I1129 14:53:55.871309 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 14:53:55 crc kubenswrapper[4907]: I1129 14:53:55.874474 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 14:53:55 crc kubenswrapper[4907]: I1129 14:53:55.876804 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.148206 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9a20c0df-80ec-4e00-9bec-409327ec2c90","Type":"ContainerStarted","Data":"3a61f4b744b0243565f44929c85f4b35904e7c16ef62e60012257c3d77ec139e"} Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.148259 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9a20c0df-80ec-4e00-9bec-409327ec2c90","Type":"ContainerStarted","Data":"2738193a42ae7e1b49e93d0237290398b5b049682b93a8c3a81706fa7be64703"} Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.151053 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerStarted","Data":"b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9"} Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.151090 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.172603 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.17258547 podStartE2EDuration="2.17258547s" podCreationTimestamp="2025-11-29 14:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:53:56.162404373 +0000 UTC m=+1534.149242025" watchObservedRunningTime="2025-11-29 14:53:56.17258547 +0000 UTC m=+1534.159423122" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.174231 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.389422 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8"] Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.391713 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.421031 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8"] Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.542895 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkqwc\" (UniqueName: \"kubernetes.io/projected/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-kube-api-access-tkqwc\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.543183 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-config\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.543284 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.543331 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.543503 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.543574 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.644822 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkqwc\" (UniqueName: \"kubernetes.io/projected/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-kube-api-access-tkqwc\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.644957 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-config\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.644987 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.645006 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.645054 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.645084 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.646147 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.646185 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-config\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.646218 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.646398 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.646692 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.667926 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkqwc\" (UniqueName: \"kubernetes.io/projected/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-kube-api-access-tkqwc\") pod \"dnsmasq-dns-6b7bbf7cf9-wtxh8\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.722046 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:53:56 crc kubenswrapper[4907]: I1129 14:53:56.865241 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 29 14:53:57 crc kubenswrapper[4907]: I1129 14:53:57.047133 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:57 crc kubenswrapper[4907]: I1129 14:53:57.047394 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="ceilometer-central-agent" containerID="cri-o://6ae044a335c0711e295c7c6e18a546c7690608b10c13f2d2d7ababdcde020aa0" gracePeriod=30 Nov 29 14:53:57 crc kubenswrapper[4907]: I1129 14:53:57.048141 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="proxy-httpd" containerID="cri-o://50cc74645f167661bd4039818293f5bf16182f90d9975b5bec439e13cdd0ebd2" gracePeriod=30 Nov 29 14:53:57 crc kubenswrapper[4907]: I1129 14:53:57.048182 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="sg-core" containerID="cri-o://a5e21dfe54f590bf40eea699e38869dcb68c98b667e5f2144c85826a3d8a9302" gracePeriod=30 Nov 29 14:53:57 crc kubenswrapper[4907]: I1129 14:53:57.048218 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="ceilometer-notification-agent" containerID="cri-o://aafd8f938f2617b87b37ff985994451ac28aeadc5356a23fdea264fa9979f417" gracePeriod=30 Nov 29 14:53:57 crc kubenswrapper[4907]: I1129 14:53:57.280445 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8"] Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.187843 4907 
generic.go:334] "Generic (PLEG): container finished" podID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerID="50cc74645f167661bd4039818293f5bf16182f90d9975b5bec439e13cdd0ebd2" exitCode=0 Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.188076 4907 generic.go:334] "Generic (PLEG): container finished" podID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerID="a5e21dfe54f590bf40eea699e38869dcb68c98b667e5f2144c85826a3d8a9302" exitCode=2 Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.188085 4907 generic.go:334] "Generic (PLEG): container finished" podID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerID="aafd8f938f2617b87b37ff985994451ac28aeadc5356a23fdea264fa9979f417" exitCode=0 Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.188092 4907 generic.go:334] "Generic (PLEG): container finished" podID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerID="6ae044a335c0711e295c7c6e18a546c7690608b10c13f2d2d7ababdcde020aa0" exitCode=0 Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.188140 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerDied","Data":"50cc74645f167661bd4039818293f5bf16182f90d9975b5bec439e13cdd0ebd2"} Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.188164 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerDied","Data":"a5e21dfe54f590bf40eea699e38869dcb68c98b667e5f2144c85826a3d8a9302"} Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.188174 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerDied","Data":"aafd8f938f2617b87b37ff985994451ac28aeadc5356a23fdea264fa9979f417"} Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.188183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerDied","Data":"6ae044a335c0711e295c7c6e18a546c7690608b10c13f2d2d7ababdcde020aa0"} Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.201527 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" event={"ID":"c0b2f060-1dbc-40ef-b32f-932d470fb4f6","Type":"ContainerStarted","Data":"ab576c0ed6897a68912b43abf5d358a7cbdf9cac1fad86068e10e0011f29de32"} Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.490009 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.490222 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.490284 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.523825 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-run-httpd\") pod \"9699098b-28d9-42d8-b43e-da4d4a655b8e\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.524020 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-sg-core-conf-yaml\") pod 
\"9699098b-28d9-42d8-b43e-da4d4a655b8e\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.524075 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-scripts\") pod \"9699098b-28d9-42d8-b43e-da4d4a655b8e\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.524116 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q88j\" (UniqueName: \"kubernetes.io/projected/9699098b-28d9-42d8-b43e-da4d4a655b8e-kube-api-access-2q88j\") pod \"9699098b-28d9-42d8-b43e-da4d4a655b8e\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.525258 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9699098b-28d9-42d8-b43e-da4d4a655b8e" (UID: "9699098b-28d9-42d8-b43e-da4d4a655b8e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.531784 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-scripts" (OuterVolumeSpecName: "scripts") pod "9699098b-28d9-42d8-b43e-da4d4a655b8e" (UID: "9699098b-28d9-42d8-b43e-da4d4a655b8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.532358 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9699098b-28d9-42d8-b43e-da4d4a655b8e-kube-api-access-2q88j" (OuterVolumeSpecName: "kube-api-access-2q88j") pod "9699098b-28d9-42d8-b43e-da4d4a655b8e" (UID: "9699098b-28d9-42d8-b43e-da4d4a655b8e"). 
InnerVolumeSpecName "kube-api-access-2q88j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.539537 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-combined-ca-bundle\") pod \"9699098b-28d9-42d8-b43e-da4d4a655b8e\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.539734 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-log-httpd\") pod \"9699098b-28d9-42d8-b43e-da4d4a655b8e\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.539864 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-config-data\") pod \"9699098b-28d9-42d8-b43e-da4d4a655b8e\" (UID: \"9699098b-28d9-42d8-b43e-da4d4a655b8e\") " Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.540477 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9699098b-28d9-42d8-b43e-da4d4a655b8e" (UID: "9699098b-28d9-42d8-b43e-da4d4a655b8e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.540983 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.541027 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.541037 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q88j\" (UniqueName: \"kubernetes.io/projected/9699098b-28d9-42d8-b43e-da4d4a655b8e-kube-api-access-2q88j\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.541046 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9699098b-28d9-42d8-b43e-da4d4a655b8e-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.627642 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9699098b-28d9-42d8-b43e-da4d4a655b8e" (UID: "9699098b-28d9-42d8-b43e-da4d4a655b8e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.642757 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.669811 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-config-data" (OuterVolumeSpecName: "config-data") pod "9699098b-28d9-42d8-b43e-da4d4a655b8e" (UID: "9699098b-28d9-42d8-b43e-da4d4a655b8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.671999 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9699098b-28d9-42d8-b43e-da4d4a655b8e" (UID: "9699098b-28d9-42d8-b43e-da4d4a655b8e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.745568 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:58 crc kubenswrapper[4907]: I1129 14:53:58.745600 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9699098b-28d9-42d8-b43e-da4d4a655b8e-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.224748 4907 generic.go:334] "Generic (PLEG): container finished" podID="c0b2f060-1dbc-40ef-b32f-932d470fb4f6" containerID="3f285d25b679d349b99759a61c35f12cb993237886348e63d5daec8f051eada4" exitCode=0 Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.224824 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" event={"ID":"c0b2f060-1dbc-40ef-b32f-932d470fb4f6","Type":"ContainerDied","Data":"3f285d25b679d349b99759a61c35f12cb993237886348e63d5daec8f051eada4"} Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.254594 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerStarted","Data":"d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714"} Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.263593 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9699098b-28d9-42d8-b43e-da4d4a655b8e","Type":"ContainerDied","Data":"a09c9b921716c79afe50079e43f4d9817be3c1b9e87a9fda653cb227c7d2d83a"} Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.263644 4907 scope.go:117] "RemoveContainer" containerID="50cc74645f167661bd4039818293f5bf16182f90d9975b5bec439e13cdd0ebd2" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.263785 4907 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.478716 4907 scope.go:117] "RemoveContainer" containerID="a5e21dfe54f590bf40eea699e38869dcb68c98b667e5f2144c85826a3d8a9302" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.503007 4907 scope.go:117] "RemoveContainer" containerID="aafd8f938f2617b87b37ff985994451ac28aeadc5356a23fdea264fa9979f417" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.515903 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.530552 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.541488 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:59 crc kubenswrapper[4907]: E1129 14:53:59.542055 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="sg-core" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.542074 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="sg-core" Nov 29 14:53:59 crc kubenswrapper[4907]: E1129 14:53:59.542098 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="ceilometer-notification-agent" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.542105 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="ceilometer-notification-agent" Nov 29 14:53:59 crc kubenswrapper[4907]: E1129 14:53:59.542134 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="ceilometer-central-agent" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.542141 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="ceilometer-central-agent" Nov 29 14:53:59 crc kubenswrapper[4907]: E1129 14:53:59.542153 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="proxy-httpd" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.542161 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="proxy-httpd" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.542400 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="sg-core" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.542423 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="ceilometer-notification-agent" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.542455 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="ceilometer-central-agent" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.542468 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" containerName="proxy-httpd" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.544728 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.549191 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.549359 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.550171 4907 scope.go:117] "RemoveContainer" containerID="6ae044a335c0711e295c7c6e18a546c7690608b10c13f2d2d7ababdcde020aa0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.556483 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.658616 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.666566 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-config-data\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.666621 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.666837 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-log-httpd\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" 
Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.666860 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.666910 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-scripts\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.666964 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-run-httpd\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.667037 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm4r5\" (UniqueName: \"kubernetes.io/projected/4cd91679-7be2-4f92-bc82-52f5592eef70-kube-api-access-wm4r5\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.768839 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-run-httpd\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.769122 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wm4r5\" (UniqueName: \"kubernetes.io/projected/4cd91679-7be2-4f92-bc82-52f5592eef70-kube-api-access-wm4r5\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.769179 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-config-data\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.769199 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.769325 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-log-httpd\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.769339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.769372 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-scripts\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 
crc kubenswrapper[4907]: I1129 14:53:59.770537 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-run-httpd\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.770620 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-log-httpd\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.773938 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.774879 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-config-data\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.776416 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-scripts\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.778380 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.788365 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm4r5\" (UniqueName: \"kubernetes.io/projected/4cd91679-7be2-4f92-bc82-52f5592eef70-kube-api-access-wm4r5\") pod \"ceilometer-0\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " pod="openstack/ceilometer-0" Nov 29 14:53:59 crc kubenswrapper[4907]: I1129 14:53:59.860744 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:54:00 crc kubenswrapper[4907]: I1129 14:54:00.214457 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:00 crc kubenswrapper[4907]: I1129 14:54:00.214683 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-log" containerID="cri-o://64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b" gracePeriod=30 Nov 29 14:54:00 crc kubenswrapper[4907]: I1129 14:54:00.214784 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-api" containerID="cri-o://b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500" gracePeriod=30 Nov 29 14:54:00 crc kubenswrapper[4907]: I1129 14:54:00.282792 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" event={"ID":"c0b2f060-1dbc-40ef-b32f-932d470fb4f6","Type":"ContainerStarted","Data":"dfc52db0ff989888f36a5a0810924b1055be1105eb4b41652ff730a6a4e347bc"} Nov 29 14:54:00 crc kubenswrapper[4907]: I1129 14:54:00.283535 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:54:00 crc kubenswrapper[4907]: I1129 14:54:00.312726 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" podStartSLOduration=4.31270654 podStartE2EDuration="4.31270654s" podCreationTimestamp="2025-11-29 14:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:54:00.312505594 +0000 UTC m=+1538.299343256" watchObservedRunningTime="2025-11-29 14:54:00.31270654 +0000 UTC m=+1538.299544192" Nov 29 14:54:00 crc kubenswrapper[4907]: I1129 14:54:00.502348 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9699098b-28d9-42d8-b43e-da4d4a655b8e" path="/var/lib/kubelet/pods/9699098b-28d9-42d8-b43e-da4d4a655b8e/volumes" Nov 29 14:54:00 crc kubenswrapper[4907]: I1129 14:54:00.652241 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:00 crc kubenswrapper[4907]: I1129 14:54:00.889109 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:00 crc kubenswrapper[4907]: W1129 14:54:00.891542 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cd91679_7be2_4f92_bc82_52f5592eef70.slice/crio-92dec910c053d1032206508ab3066e04672d30b40f4cb1d4db3ddbe74dc24ad9 WatchSource:0}: Error finding container 92dec910c053d1032206508ab3066e04672d30b40f4cb1d4db3ddbe74dc24ad9: Status 404 returned error can't find the container with id 92dec910c053d1032206508ab3066e04672d30b40f4cb1d4db3ddbe74dc24ad9 Nov 29 14:54:01 crc kubenswrapper[4907]: I1129 14:54:01.297031 4907 generic.go:334] "Generic (PLEG): container finished" podID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerID="64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b" exitCode=143 Nov 29 14:54:01 crc kubenswrapper[4907]: I1129 14:54:01.297163 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d4eabed3-66b5-45e3-a2d6-a174b4064289","Type":"ContainerDied","Data":"64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b"} Nov 29 14:54:01 crc kubenswrapper[4907]: I1129 14:54:01.303648 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerStarted","Data":"318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78"} Nov 29 14:54:01 crc kubenswrapper[4907]: I1129 14:54:01.306428 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerStarted","Data":"92dec910c053d1032206508ab3066e04672d30b40f4cb1d4db3ddbe74dc24ad9"} Nov 29 14:54:02 crc kubenswrapper[4907]: I1129 14:54:02.323113 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerStarted","Data":"0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639"} Nov 29 14:54:03 crc kubenswrapper[4907]: I1129 14:54:03.346519 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerStarted","Data":"df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad"} Nov 29 14:54:03 crc kubenswrapper[4907]: I1129 14:54:03.346792 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-api" containerID="cri-o://b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9" gracePeriod=30 Nov 29 14:54:03 crc kubenswrapper[4907]: I1129 14:54:03.347336 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-listener" containerID="cri-o://df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad" 
gracePeriod=30 Nov 29 14:54:03 crc kubenswrapper[4907]: I1129 14:54:03.347350 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-notifier" containerID="cri-o://318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78" gracePeriod=30 Nov 29 14:54:03 crc kubenswrapper[4907]: I1129 14:54:03.347362 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-evaluator" containerID="cri-o://d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714" gracePeriod=30 Nov 29 14:54:03 crc kubenswrapper[4907]: I1129 14:54:03.351414 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerStarted","Data":"a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0"} Nov 29 14:54:03 crc kubenswrapper[4907]: I1129 14:54:03.387701 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.06158813 podStartE2EDuration="10.387678728s" podCreationTimestamp="2025-11-29 14:53:53 +0000 UTC" firstStartedPulling="2025-11-29 14:53:54.770731594 +0000 UTC m=+1532.757569246" lastFinishedPulling="2025-11-29 14:54:02.096822192 +0000 UTC m=+1540.083659844" observedRunningTime="2025-11-29 14:54:03.367573141 +0000 UTC m=+1541.354410793" watchObservedRunningTime="2025-11-29 14:54:03.387678728 +0000 UTC m=+1541.374516390" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.088280 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.234372 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjzqp\" (UniqueName: \"kubernetes.io/projected/d4eabed3-66b5-45e3-a2d6-a174b4064289-kube-api-access-kjzqp\") pod \"d4eabed3-66b5-45e3-a2d6-a174b4064289\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.234695 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-combined-ca-bundle\") pod \"d4eabed3-66b5-45e3-a2d6-a174b4064289\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.234825 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4eabed3-66b5-45e3-a2d6-a174b4064289-logs\") pod \"d4eabed3-66b5-45e3-a2d6-a174b4064289\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.234978 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-config-data\") pod \"d4eabed3-66b5-45e3-a2d6-a174b4064289\" (UID: \"d4eabed3-66b5-45e3-a2d6-a174b4064289\") " Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.237487 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4eabed3-66b5-45e3-a2d6-a174b4064289-logs" (OuterVolumeSpecName: "logs") pod "d4eabed3-66b5-45e3-a2d6-a174b4064289" (UID: "d4eabed3-66b5-45e3-a2d6-a174b4064289"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.247749 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4eabed3-66b5-45e3-a2d6-a174b4064289-kube-api-access-kjzqp" (OuterVolumeSpecName: "kube-api-access-kjzqp") pod "d4eabed3-66b5-45e3-a2d6-a174b4064289" (UID: "d4eabed3-66b5-45e3-a2d6-a174b4064289"). InnerVolumeSpecName "kube-api-access-kjzqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.286917 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4eabed3-66b5-45e3-a2d6-a174b4064289" (UID: "d4eabed3-66b5-45e3-a2d6-a174b4064289"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.288767 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-config-data" (OuterVolumeSpecName: "config-data") pod "d4eabed3-66b5-45e3-a2d6-a174b4064289" (UID: "d4eabed3-66b5-45e3-a2d6-a174b4064289"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.340568 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjzqp\" (UniqueName: \"kubernetes.io/projected/d4eabed3-66b5-45e3-a2d6-a174b4064289-kube-api-access-kjzqp\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.340597 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.340609 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4eabed3-66b5-45e3-a2d6-a174b4064289-logs\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.340621 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4eabed3-66b5-45e3-a2d6-a174b4064289-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.367046 4907 generic.go:334] "Generic (PLEG): container finished" podID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerID="b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500" exitCode=0 Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.368252 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.370421 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4eabed3-66b5-45e3-a2d6-a174b4064289","Type":"ContainerDied","Data":"b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500"} Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.370508 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4eabed3-66b5-45e3-a2d6-a174b4064289","Type":"ContainerDied","Data":"b9053da85f02158e838c16254c68717d2925b3e504c53d53048532c49810988a"} Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.370526 4907 scope.go:117] "RemoveContainer" containerID="b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.380306 4907 generic.go:334] "Generic (PLEG): container finished" podID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerID="318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78" exitCode=0 Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.380575 4907 generic.go:334] "Generic (PLEG): container finished" podID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerID="d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714" exitCode=0 Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.380584 4907 generic.go:334] "Generic (PLEG): container finished" podID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerID="b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9" exitCode=0 Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.380455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerDied","Data":"318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78"} Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.380667 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-0" event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerDied","Data":"d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714"} Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.380681 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerDied","Data":"b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9"} Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.385287 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerStarted","Data":"7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6"} Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.419381 4907 scope.go:117] "RemoveContainer" containerID="64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.424262 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.469650 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.515536 4907 scope.go:117] "RemoveContainer" containerID="b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500" Nov 29 14:54:04 crc kubenswrapper[4907]: E1129 14:54:04.516278 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500\": container with ID starting with b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500 not found: ID does not exist" containerID="b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.516301 4907 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500"} err="failed to get container status \"b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500\": rpc error: code = NotFound desc = could not find container \"b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500\": container with ID starting with b19b6c34634174b118f1075cab767e2eee77633528f9c9a2cdb802987d191500 not found: ID does not exist" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.516319 4907 scope.go:117] "RemoveContainer" containerID="64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b" Nov 29 14:54:04 crc kubenswrapper[4907]: E1129 14:54:04.516660 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b\": container with ID starting with 64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b not found: ID does not exist" containerID="64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.516690 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b"} err="failed to get container status \"64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b\": rpc error: code = NotFound desc = could not find container \"64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b\": container with ID starting with 64617b44e3d12fb59fe2b9c108f1791a9cb5fa254f91b00d109133fb3b35873b not found: ID does not exist" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.522651 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" path="/var/lib/kubelet/pods/d4eabed3-66b5-45e3-a2d6-a174b4064289/volumes" Nov 29 14:54:04 crc kubenswrapper[4907]: 
I1129 14:54:04.523294 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:04 crc kubenswrapper[4907]: E1129 14:54:04.523706 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-api" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.523724 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-api" Nov 29 14:54:04 crc kubenswrapper[4907]: E1129 14:54:04.523737 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-log" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.523743 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-log" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.523956 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-api" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.523976 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4eabed3-66b5-45e3-a2d6-a174b4064289" containerName="nova-api-log" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.525417 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.525835 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.529771 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.530027 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.530562 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.652497 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.652567 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-public-tls-certs\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.652691 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e92204-42b7-473a-aca5-3c2f480f8601-logs\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.652768 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-config-data\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " 
pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.652859 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79ndh\" (UniqueName: \"kubernetes.io/projected/f3e92204-42b7-473a-aca5-3c2f480f8601-kube-api-access-79ndh\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.652912 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.657971 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.675250 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.755171 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.755247 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-public-tls-certs\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.755339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e92204-42b7-473a-aca5-3c2f480f8601-logs\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.755396 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-config-data\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.755483 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79ndh\" (UniqueName: \"kubernetes.io/projected/f3e92204-42b7-473a-aca5-3c2f480f8601-kube-api-access-79ndh\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.755520 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.756098 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e92204-42b7-473a-aca5-3c2f480f8601-logs\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.759428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 
14:54:04.759688 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-public-tls-certs\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.760168 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-config-data\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.761156 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.779021 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79ndh\" (UniqueName: \"kubernetes.io/projected/f3e92204-42b7-473a-aca5-3c2f480f8601-kube-api-access-79ndh\") pod \"nova-api-0\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " pod="openstack/nova-api-0" Nov 29 14:54:04 crc kubenswrapper[4907]: I1129 14:54:04.868487 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.340717 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.402557 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3e92204-42b7-473a-aca5-3c2f480f8601","Type":"ContainerStarted","Data":"05b6c4faf4352cc123212a198b960c187e88ca7180136870d419b847f59c4a8b"} Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.422864 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.747261 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dhqrp"] Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.749187 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.753195 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.753377 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.760602 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhqrp"] Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.890545 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6zw\" (UniqueName: \"kubernetes.io/projected/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-kube-api-access-nj6zw\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 
14:54:05.890601 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-config-data\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.890634 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.890692 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-scripts\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.992527 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6zw\" (UniqueName: \"kubernetes.io/projected/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-kube-api-access-nj6zw\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.993323 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-config-data\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 
14:54:05.993990 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.994046 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-scripts\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.998696 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:05 crc kubenswrapper[4907]: I1129 14:54:05.999767 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-config-data\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.000125 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-scripts\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.009411 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nj6zw\" (UniqueName: \"kubernetes.io/projected/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-kube-api-access-nj6zw\") pod \"nova-cell1-cell-mapping-dhqrp\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.127723 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.419286 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerStarted","Data":"0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df"} Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.419536 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="ceilometer-central-agent" containerID="cri-o://0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639" gracePeriod=30 Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.420005 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.420506 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="proxy-httpd" containerID="cri-o://0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df" gracePeriod=30 Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.420593 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="sg-core" containerID="cri-o://7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6" gracePeriod=30 Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 
14:54:06.420645 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="ceilometer-notification-agent" containerID="cri-o://a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0" gracePeriod=30 Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.430936 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3e92204-42b7-473a-aca5-3c2f480f8601","Type":"ContainerStarted","Data":"4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce"} Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.430968 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3e92204-42b7-473a-aca5-3c2f480f8601","Type":"ContainerStarted","Data":"b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c"} Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.475753 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.146925806 podStartE2EDuration="7.475711805s" podCreationTimestamp="2025-11-29 14:53:59 +0000 UTC" firstStartedPulling="2025-11-29 14:54:00.894287038 +0000 UTC m=+1538.881124690" lastFinishedPulling="2025-11-29 14:54:05.223073037 +0000 UTC m=+1543.209910689" observedRunningTime="2025-11-29 14:54:06.449515076 +0000 UTC m=+1544.436352728" watchObservedRunningTime="2025-11-29 14:54:06.475711805 +0000 UTC m=+1544.462549457" Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.504607 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.504589149 podStartE2EDuration="2.504589149s" podCreationTimestamp="2025-11-29 14:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:54:06.477227047 +0000 UTC m=+1544.464064709" 
watchObservedRunningTime="2025-11-29 14:54:06.504589149 +0000 UTC m=+1544.491426801" Nov 29 14:54:06 crc kubenswrapper[4907]: W1129 14:54:06.633842 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d92c9d8_fedb_4ab8_ac88_7e6a3b0738c6.slice/crio-74d327ebd750119f34d5b3d7ad29a3b1b422fdd62e5abfb6e6bd4df0ddb0c134 WatchSource:0}: Error finding container 74d327ebd750119f34d5b3d7ad29a3b1b422fdd62e5abfb6e6bd4df0ddb0c134: Status 404 returned error can't find the container with id 74d327ebd750119f34d5b3d7ad29a3b1b422fdd62e5abfb6e6bd4df0ddb0c134 Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.640611 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhqrp"] Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.723582 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.937407 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vtdtq"] Nov 29 14:54:06 crc kubenswrapper[4907]: I1129 14:54:06.937660 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" podUID="dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" containerName="dnsmasq-dns" containerID="cri-o://42554551cad1d9876fe163a761d3b2631971379fdde2e0cfff1a61e784378ca7" gracePeriod=10 Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.439496 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhqrp" event={"ID":"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6","Type":"ContainerStarted","Data":"5bccaa7969a6878d47cce9db04567f931f270aa24332b73632764fd38abdaf16"} Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.439769 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhqrp" 
event={"ID":"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6","Type":"ContainerStarted","Data":"74d327ebd750119f34d5b3d7ad29a3b1b422fdd62e5abfb6e6bd4df0ddb0c134"} Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.443849 4907 generic.go:334] "Generic (PLEG): container finished" podID="dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" containerID="42554551cad1d9876fe163a761d3b2631971379fdde2e0cfff1a61e784378ca7" exitCode=0 Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.443888 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" event={"ID":"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3","Type":"ContainerDied","Data":"42554551cad1d9876fe163a761d3b2631971379fdde2e0cfff1a61e784378ca7"} Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.456807 4907 generic.go:334] "Generic (PLEG): container finished" podID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerID="0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df" exitCode=0 Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.456843 4907 generic.go:334] "Generic (PLEG): container finished" podID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerID="7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6" exitCode=2 Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.456852 4907 generic.go:334] "Generic (PLEG): container finished" podID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerID="a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0" exitCode=0 Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.456895 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerDied","Data":"0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df"} Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.456951 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerDied","Data":"7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6"} Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.456966 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerDied","Data":"a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0"} Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.458267 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dhqrp" podStartSLOduration=2.458247988 podStartE2EDuration="2.458247988s" podCreationTimestamp="2025-11-29 14:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:54:07.458047612 +0000 UTC m=+1545.444885264" watchObservedRunningTime="2025-11-29 14:54:07.458247988 +0000 UTC m=+1545.445085640" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.533983 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.649564 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-nb\") pod \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.649856 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-config\") pod \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.649947 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-swift-storage-0\") pod \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.650054 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-svc\") pod \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.650203 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-sb\") pod \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.650324 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9lk7\" 
(UniqueName: \"kubernetes.io/projected/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-kube-api-access-k9lk7\") pod \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\" (UID: \"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3\") " Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.658003 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-kube-api-access-k9lk7" (OuterVolumeSpecName: "kube-api-access-k9lk7") pod "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" (UID: "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3"). InnerVolumeSpecName "kube-api-access-k9lk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.719664 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-config" (OuterVolumeSpecName: "config") pod "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" (UID: "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.721938 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" (UID: "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.728922 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" (UID: "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.742879 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" (UID: "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.748530 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" (UID: "dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.754738 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.754807 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.754823 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.754835 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 
14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.754902 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:07 crc kubenswrapper[4907]: I1129 14:54:07.754915 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9lk7\" (UniqueName: \"kubernetes.io/projected/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3-kube-api-access-k9lk7\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:08 crc kubenswrapper[4907]: I1129 14:54:08.471807 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" event={"ID":"dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3","Type":"ContainerDied","Data":"62a37cf7298b01083ef1722572a7cc37fd183010caa16a44338c05255cc6610c"} Nov 29 14:54:08 crc kubenswrapper[4907]: I1129 14:54:08.471865 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-vtdtq" Nov 29 14:54:08 crc kubenswrapper[4907]: I1129 14:54:08.471887 4907 scope.go:117] "RemoveContainer" containerID="42554551cad1d9876fe163a761d3b2631971379fdde2e0cfff1a61e784378ca7" Nov 29 14:54:08 crc kubenswrapper[4907]: I1129 14:54:08.499160 4907 scope.go:117] "RemoveContainer" containerID="bb0dca8b765198896fe43bcba8ea0eb32f8be1c8334aaf1cd72607d3311ee860" Nov 29 14:54:08 crc kubenswrapper[4907]: I1129 14:54:08.535731 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vtdtq"] Nov 29 14:54:08 crc kubenswrapper[4907]: I1129 14:54:08.547513 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vtdtq"] Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.395640 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.500377 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" path="/var/lib/kubelet/pods/dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3/volumes" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.506005 4907 generic.go:334] "Generic (PLEG): container finished" podID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerID="0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639" exitCode=0 Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.506047 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerDied","Data":"0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639"} Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.506073 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd91679-7be2-4f92-bc82-52f5592eef70","Type":"ContainerDied","Data":"92dec910c053d1032206508ab3066e04672d30b40f4cb1d4db3ddbe74dc24ad9"} Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.506074 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.506091 4907 scope.go:117] "RemoveContainer" containerID="0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.533357 4907 scope.go:117] "RemoveContainer" containerID="7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.538807 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-combined-ca-bundle\") pod \"4cd91679-7be2-4f92-bc82-52f5592eef70\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.538946 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm4r5\" (UniqueName: \"kubernetes.io/projected/4cd91679-7be2-4f92-bc82-52f5592eef70-kube-api-access-wm4r5\") pod \"4cd91679-7be2-4f92-bc82-52f5592eef70\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.538993 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-scripts\") pod \"4cd91679-7be2-4f92-bc82-52f5592eef70\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.539076 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-run-httpd\") pod \"4cd91679-7be2-4f92-bc82-52f5592eef70\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.539164 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-config-data\") pod \"4cd91679-7be2-4f92-bc82-52f5592eef70\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.539645 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4cd91679-7be2-4f92-bc82-52f5592eef70" (UID: "4cd91679-7be2-4f92-bc82-52f5592eef70"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.539709 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-sg-core-conf-yaml\") pod \"4cd91679-7be2-4f92-bc82-52f5592eef70\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.540048 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-log-httpd\") pod \"4cd91679-7be2-4f92-bc82-52f5592eef70\" (UID: \"4cd91679-7be2-4f92-bc82-52f5592eef70\") " Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.540666 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.540955 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4cd91679-7be2-4f92-bc82-52f5592eef70" (UID: "4cd91679-7be2-4f92-bc82-52f5592eef70"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.554765 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-scripts" (OuterVolumeSpecName: "scripts") pod "4cd91679-7be2-4f92-bc82-52f5592eef70" (UID: "4cd91679-7be2-4f92-bc82-52f5592eef70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.557856 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd91679-7be2-4f92-bc82-52f5592eef70-kube-api-access-wm4r5" (OuterVolumeSpecName: "kube-api-access-wm4r5") pod "4cd91679-7be2-4f92-bc82-52f5592eef70" (UID: "4cd91679-7be2-4f92-bc82-52f5592eef70"). InnerVolumeSpecName "kube-api-access-wm4r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.569205 4907 scope.go:117] "RemoveContainer" containerID="a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.578584 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4cd91679-7be2-4f92-bc82-52f5592eef70" (UID: "4cd91679-7be2-4f92-bc82-52f5592eef70"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.642804 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm4r5\" (UniqueName: \"kubernetes.io/projected/4cd91679-7be2-4f92-bc82-52f5592eef70-kube-api-access-wm4r5\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.642849 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.642858 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.642866 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd91679-7be2-4f92-bc82-52f5592eef70-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.683768 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cd91679-7be2-4f92-bc82-52f5592eef70" (UID: "4cd91679-7be2-4f92-bc82-52f5592eef70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.696567 4907 scope.go:117] "RemoveContainer" containerID="0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.703293 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-config-data" (OuterVolumeSpecName: "config-data") pod "4cd91679-7be2-4f92-bc82-52f5592eef70" (UID: "4cd91679-7be2-4f92-bc82-52f5592eef70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.719714 4907 scope.go:117] "RemoveContainer" containerID="0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df" Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.720221 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df\": container with ID starting with 0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df not found: ID does not exist" containerID="0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.720247 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df"} err="failed to get container status \"0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df\": rpc error: code = NotFound desc = could not find container \"0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df\": container with ID starting with 0e372bb41f342f3817d808714ba970481dd74d13159dc91f6af8147992b474df not found: ID does not exist" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.720268 4907 scope.go:117] "RemoveContainer" 
containerID="7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6" Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.720535 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6\": container with ID starting with 7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6 not found: ID does not exist" containerID="7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.720552 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6"} err="failed to get container status \"7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6\": rpc error: code = NotFound desc = could not find container \"7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6\": container with ID starting with 7b51a414f62c6912c4a017f8b539b336b86dcae82c3805a1d660c6e8429b41a6 not found: ID does not exist" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.720567 4907 scope.go:117] "RemoveContainer" containerID="a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0" Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.720802 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0\": container with ID starting with a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0 not found: ID does not exist" containerID="a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.720823 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0"} err="failed to get container status \"a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0\": rpc error: code = NotFound desc = could not find container \"a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0\": container with ID starting with a7b177bcc7cadd38cb4990f44c9bfeb9e89fd08053ee52413e6ad59c928662c0 not found: ID does not exist" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.720835 4907 scope.go:117] "RemoveContainer" containerID="0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639" Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.721126 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639\": container with ID starting with 0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639 not found: ID does not exist" containerID="0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.721148 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639"} err="failed to get container status \"0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639\": rpc error: code = NotFound desc = could not find container \"0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639\": container with ID starting with 0268f344f883732cbefceb39bf0b46f91317b3e03f813c41368edb0f630b9639 not found: ID does not exist" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.745211 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:10 crc kubenswrapper[4907]: 
I1129 14:54:10.745245 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd91679-7be2-4f92-bc82-52f5592eef70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.845047 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.862368 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.878005 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.878616 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" containerName="init" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.878636 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" containerName="init" Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.878669 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" containerName="dnsmasq-dns" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.878681 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" containerName="dnsmasq-dns" Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.878695 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="ceilometer-notification-agent" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.878704 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="ceilometer-notification-agent" Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.878723 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="ceilometer-central-agent" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.878732 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="ceilometer-central-agent" Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.878751 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="sg-core" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.878759 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="sg-core" Nov 29 14:54:10 crc kubenswrapper[4907]: E1129 14:54:10.878810 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="proxy-httpd" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.878819 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="proxy-httpd" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.879096 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="ceilometer-central-agent" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.879137 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2b6b3d-9fd8-4bb7-bd36-522fe41370e3" containerName="dnsmasq-dns" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.879158 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="proxy-httpd" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.879177 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="ceilometer-notification-agent" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.879199 4907 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" containerName="sg-core" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.881923 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.883967 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.884019 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.891614 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.950276 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.950325 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-scripts\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.950356 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrhnh\" (UniqueName: \"kubernetes.io/projected/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-kube-api-access-mrhnh\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.950389 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-log-httpd\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.950406 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-config-data\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.950468 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:10 crc kubenswrapper[4907]: I1129 14:54:10.950516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-run-httpd\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.052607 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-scripts\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.052672 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrhnh\" (UniqueName: \"kubernetes.io/projected/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-kube-api-access-mrhnh\") pod 
\"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.052728 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-log-httpd\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.052754 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-config-data\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.052818 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.052885 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-run-httpd\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.053009 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.054026 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-log-httpd\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.055406 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-run-httpd\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.059670 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.060199 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-config-data\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.060351 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-scripts\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.061080 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.082654 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrhnh\" (UniqueName: \"kubernetes.io/projected/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-kube-api-access-mrhnh\") pod \"ceilometer-0\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") " pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.210574 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.523853 4907 generic.go:334] "Generic (PLEG): container finished" podID="5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6" containerID="5bccaa7969a6878d47cce9db04567f931f270aa24332b73632764fd38abdaf16" exitCode=0 Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.523991 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhqrp" event={"ID":"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6","Type":"ContainerDied","Data":"5bccaa7969a6878d47cce9db04567f931f270aa24332b73632764fd38abdaf16"} Nov 29 14:54:11 crc kubenswrapper[4907]: I1129 14:54:11.760550 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:11 crc kubenswrapper[4907]: W1129 14:54:11.760703 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5f072fc_976b_4cc4_8c87_fe3c52ab9829.slice/crio-6b53c5eae9b91f0726ec0da1501e5d61dfa42e8f0706cf169bf535549ad281ce WatchSource:0}: Error finding container 6b53c5eae9b91f0726ec0da1501e5d61dfa42e8f0706cf169bf535549ad281ce: Status 404 returned error can't find the container with id 6b53c5eae9b91f0726ec0da1501e5d61dfa42e8f0706cf169bf535549ad281ce Nov 29 14:54:12 crc kubenswrapper[4907]: I1129 14:54:12.510191 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd91679-7be2-4f92-bc82-52f5592eef70" path="/var/lib/kubelet/pods/4cd91679-7be2-4f92-bc82-52f5592eef70/volumes" Nov 29 14:54:12 crc 
kubenswrapper[4907]: I1129 14:54:12.546298 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerStarted","Data":"6b53c5eae9b91f0726ec0da1501e5d61dfa42e8f0706cf169bf535549ad281ce"} Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.101526 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.209692 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-combined-ca-bundle\") pod \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.209772 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-scripts\") pod \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.209940 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-config-data\") pod \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.210047 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj6zw\" (UniqueName: \"kubernetes.io/projected/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-kube-api-access-nj6zw\") pod \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\" (UID: \"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6\") " Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.217239 4907 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-scripts" (OuterVolumeSpecName: "scripts") pod "5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6" (UID: "5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.238590 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-kube-api-access-nj6zw" (OuterVolumeSpecName: "kube-api-access-nj6zw") pod "5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6" (UID: "5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6"). InnerVolumeSpecName "kube-api-access-nj6zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.268415 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6" (UID: "5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.273623 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-config-data" (OuterVolumeSpecName: "config-data") pod "5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6" (UID: "5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.314358 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.314404 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.314418 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.314430 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj6zw\" (UniqueName: \"kubernetes.io/projected/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6-kube-api-access-nj6zw\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.564968 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhqrp" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.564966 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhqrp" event={"ID":"5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6","Type":"ContainerDied","Data":"74d327ebd750119f34d5b3d7ad29a3b1b422fdd62e5abfb6e6bd4df0ddb0c134"} Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.565412 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d327ebd750119f34d5b3d7ad29a3b1b422fdd62e5abfb6e6bd4df0ddb0c134" Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.581871 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerStarted","Data":"060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a"} Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.581959 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerStarted","Data":"2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4"} Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.765063 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.765459 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerName="nova-api-log" containerID="cri-o://b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c" gracePeriod=30 Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.765485 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerName="nova-api-api" 
containerID="cri-o://4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce" gracePeriod=30 Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.785164 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.785388 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="66fa6e0b-4574-4377-ab95-8a8843ffde6d" containerName="nova-scheduler-scheduler" containerID="cri-o://8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332" gracePeriod=30 Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.807588 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.807825 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-log" containerID="cri-o://ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152" gracePeriod=30 Nov 29 14:54:13 crc kubenswrapper[4907]: I1129 14:54:13.807964 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-metadata" containerID="cri-o://79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5" gracePeriod=30 Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.547118 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.604951 4907 generic.go:334] "Generic (PLEG): container finished" podID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerID="ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152" exitCode=143 Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.604999 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"08c24285-2f83-4e3b-8c8d-acdb01ed50ee","Type":"ContainerDied","Data":"ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152"} Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.607288 4907 generic.go:334] "Generic (PLEG): container finished" podID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerID="4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce" exitCode=0 Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.607321 4907 generic.go:334] "Generic (PLEG): container finished" podID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerID="b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c" exitCode=143 Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.607396 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.607396 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3e92204-42b7-473a-aca5-3c2f480f8601","Type":"ContainerDied","Data":"4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce"} Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.607451 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3e92204-42b7-473a-aca5-3c2f480f8601","Type":"ContainerDied","Data":"b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c"} Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.607466 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3e92204-42b7-473a-aca5-3c2f480f8601","Type":"ContainerDied","Data":"05b6c4faf4352cc123212a198b960c187e88ca7180136870d419b847f59c4a8b"} Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.607485 4907 scope.go:117] "RemoveContainer" containerID="4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.612297 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerStarted","Data":"33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5"} Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.649525 4907 scope.go:117] "RemoveContainer" containerID="b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.652341 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-internal-tls-certs\") pod \"f3e92204-42b7-473a-aca5-3c2f480f8601\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 
14:54:14.652487 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-config-data\") pod \"f3e92204-42b7-473a-aca5-3c2f480f8601\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.652565 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e92204-42b7-473a-aca5-3c2f480f8601-logs\") pod \"f3e92204-42b7-473a-aca5-3c2f480f8601\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.652616 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-combined-ca-bundle\") pod \"f3e92204-42b7-473a-aca5-3c2f480f8601\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.652671 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-public-tls-certs\") pod \"f3e92204-42b7-473a-aca5-3c2f480f8601\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.653881 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79ndh\" (UniqueName: \"kubernetes.io/projected/f3e92204-42b7-473a-aca5-3c2f480f8601-kube-api-access-79ndh\") pod \"f3e92204-42b7-473a-aca5-3c2f480f8601\" (UID: \"f3e92204-42b7-473a-aca5-3c2f480f8601\") " Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.654348 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3e92204-42b7-473a-aca5-3c2f480f8601-logs" (OuterVolumeSpecName: "logs") pod "f3e92204-42b7-473a-aca5-3c2f480f8601" 
(UID: "f3e92204-42b7-473a-aca5-3c2f480f8601"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.654666 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e92204-42b7-473a-aca5-3c2f480f8601-logs\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.660783 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e92204-42b7-473a-aca5-3c2f480f8601-kube-api-access-79ndh" (OuterVolumeSpecName: "kube-api-access-79ndh") pod "f3e92204-42b7-473a-aca5-3c2f480f8601" (UID: "f3e92204-42b7-473a-aca5-3c2f480f8601"). InnerVolumeSpecName "kube-api-access-79ndh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.679029 4907 scope.go:117] "RemoveContainer" containerID="4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce" Nov 29 14:54:14 crc kubenswrapper[4907]: E1129 14:54:14.680629 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce\": container with ID starting with 4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce not found: ID does not exist" containerID="4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.680664 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce"} err="failed to get container status \"4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce\": rpc error: code = NotFound desc = could not find container \"4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce\": container with ID starting with 
4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce not found: ID does not exist" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.680684 4907 scope.go:117] "RemoveContainer" containerID="b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c" Nov 29 14:54:14 crc kubenswrapper[4907]: E1129 14:54:14.681766 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c\": container with ID starting with b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c not found: ID does not exist" containerID="b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.681792 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c"} err="failed to get container status \"b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c\": rpc error: code = NotFound desc = could not find container \"b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c\": container with ID starting with b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c not found: ID does not exist" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.681806 4907 scope.go:117] "RemoveContainer" containerID="4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.682112 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce"} err="failed to get container status \"4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce\": rpc error: code = NotFound desc = could not find container \"4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce\": container with ID 
starting with 4fa9a5777479491fbd35b5f7e00ddf6b345d4adc0196c5b839b8a37ac955f7ce not found: ID does not exist" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.682127 4907 scope.go:117] "RemoveContainer" containerID="b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.682870 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c"} err="failed to get container status \"b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c\": rpc error: code = NotFound desc = could not find container \"b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c\": container with ID starting with b83782f065f1a78ae345bfd1b5e29afbc01eec2c504edc29d1fa2071cb770d0c not found: ID does not exist" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.688677 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3e92204-42b7-473a-aca5-3c2f480f8601" (UID: "f3e92204-42b7-473a-aca5-3c2f480f8601"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.718035 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-config-data" (OuterVolumeSpecName: "config-data") pod "f3e92204-42b7-473a-aca5-3c2f480f8601" (UID: "f3e92204-42b7-473a-aca5-3c2f480f8601"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.730523 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f3e92204-42b7-473a-aca5-3c2f480f8601" (UID: "f3e92204-42b7-473a-aca5-3c2f480f8601"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.736025 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f3e92204-42b7-473a-aca5-3c2f480f8601" (UID: "f3e92204-42b7-473a-aca5-3c2f480f8601"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.757572 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.757807 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.757904 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.757995 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e92204-42b7-473a-aca5-3c2f480f8601-public-tls-certs\") on node \"crc\" 
DevicePath \"\"" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.758085 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79ndh\" (UniqueName: \"kubernetes.io/projected/f3e92204-42b7-473a-aca5-3c2f480f8601-kube-api-access-79ndh\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.940089 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.955053 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.970950 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:14 crc kubenswrapper[4907]: E1129 14:54:14.971840 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6" containerName="nova-manage" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.971864 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6" containerName="nova-manage" Nov 29 14:54:14 crc kubenswrapper[4907]: E1129 14:54:14.971888 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerName="nova-api-api" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.971897 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerName="nova-api-api" Nov 29 14:54:14 crc kubenswrapper[4907]: E1129 14:54:14.971915 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerName="nova-api-log" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.971923 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerName="nova-api-log" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.972145 4907 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerName="nova-api-log" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.972166 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6" containerName="nova-manage" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.972196 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e92204-42b7-473a-aca5-3c2f480f8601" containerName="nova-api-api" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.973691 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.985685 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.986319 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 29 14:54:14 crc kubenswrapper[4907]: I1129 14:54:14.986459 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.006891 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.063861 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-logs\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.063926 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-config-data\") pod \"nova-api-0\" (UID: 
\"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.064020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlssm\" (UniqueName: \"kubernetes.io/projected/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-kube-api-access-hlssm\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.064045 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-public-tls-certs\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.064101 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.064116 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.165642 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlssm\" (UniqueName: \"kubernetes.io/projected/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-kube-api-access-hlssm\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 
14:54:15.165685 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-public-tls-certs\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.165744 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.165764 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.165839 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-logs\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.165862 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-config-data\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.166423 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-logs\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " 
pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.170968 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.171206 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.171338 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-config-data\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.173164 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-public-tls-certs\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.195512 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlssm\" (UniqueName: \"kubernetes.io/projected/3bf63a4b-e850-4b1c-b7ea-00bf87d3d125-kube-api-access-hlssm\") pod \"nova-api-0\" (UID: \"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125\") " pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.298255 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.633237 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerStarted","Data":"0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f"} Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.633968 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.806991 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.201419651 podStartE2EDuration="5.806969869s" podCreationTimestamp="2025-11-29 14:54:10 +0000 UTC" firstStartedPulling="2025-11-29 14:54:11.764238415 +0000 UTC m=+1549.751076067" lastFinishedPulling="2025-11-29 14:54:15.369788623 +0000 UTC m=+1553.356626285" observedRunningTime="2025-11-29 14:54:15.660027676 +0000 UTC m=+1553.646865318" watchObservedRunningTime="2025-11-29 14:54:15.806969869 +0000 UTC m=+1553.793807521" Nov 29 14:54:15 crc kubenswrapper[4907]: I1129 14:54:15.808091 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 14:54:15 crc kubenswrapper[4907]: W1129 14:54:15.817379 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bf63a4b_e850_4b1c_b7ea_00bf87d3d125.slice/crio-1b8f7a45cc00d9142c95d5282497ecf3ad613c142784030428456c50220dfbbd WatchSource:0}: Error finding container 1b8f7a45cc00d9142c95d5282497ecf3ad613c142784030428456c50220dfbbd: Status 404 returned error can't find the container with id 1b8f7a45cc00d9142c95d5282497ecf3ad613c142784030428456c50220dfbbd Nov 29 14:54:16 crc kubenswrapper[4907]: E1129 14:54:16.061642 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332 is running failed: container process not found" containerID="8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 14:54:16 crc kubenswrapper[4907]: E1129 14:54:16.067274 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332 is running failed: container process not found" containerID="8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 14:54:16 crc kubenswrapper[4907]: E1129 14:54:16.071551 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332 is running failed: container process not found" containerID="8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 14:54:16 crc kubenswrapper[4907]: E1129 14:54:16.071610 4907 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="66fa6e0b-4574-4377-ab95-8a8843ffde6d" containerName="nova-scheduler-scheduler" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.339062 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.397698 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v4bf\" (UniqueName: \"kubernetes.io/projected/66fa6e0b-4574-4377-ab95-8a8843ffde6d-kube-api-access-5v4bf\") pod \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.398574 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-combined-ca-bundle\") pod \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.398723 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-config-data\") pod \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\" (UID: \"66fa6e0b-4574-4377-ab95-8a8843ffde6d\") " Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.414868 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66fa6e0b-4574-4377-ab95-8a8843ffde6d-kube-api-access-5v4bf" (OuterVolumeSpecName: "kube-api-access-5v4bf") pod "66fa6e0b-4574-4377-ab95-8a8843ffde6d" (UID: "66fa6e0b-4574-4377-ab95-8a8843ffde6d"). InnerVolumeSpecName "kube-api-access-5v4bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.428548 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66fa6e0b-4574-4377-ab95-8a8843ffde6d" (UID: "66fa6e0b-4574-4377-ab95-8a8843ffde6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.433088 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-config-data" (OuterVolumeSpecName: "config-data") pod "66fa6e0b-4574-4377-ab95-8a8843ffde6d" (UID: "66fa6e0b-4574-4377-ab95-8a8843ffde6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.492346 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3e92204-42b7-473a-aca5-3c2f480f8601" path="/var/lib/kubelet/pods/f3e92204-42b7-473a-aca5-3c2f480f8601/volumes" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.518057 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v4bf\" (UniqueName: \"kubernetes.io/projected/66fa6e0b-4574-4377-ab95-8a8843ffde6d-kube-api-access-5v4bf\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.518127 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.518141 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa6e0b-4574-4377-ab95-8a8843ffde6d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.649263 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125","Type":"ContainerStarted","Data":"3b8c4227bf8e87cb946a76d4ae4bd9c1de8fa39bb56494660bbf3f180bb9234c"} Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.649319 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125","Type":"ContainerStarted","Data":"103d99b8f45fb6776a0cc095a8db9e2b61aa4519b37c3b925e66f90ff92d1c8a"} Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.649332 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3bf63a4b-e850-4b1c-b7ea-00bf87d3d125","Type":"ContainerStarted","Data":"1b8f7a45cc00d9142c95d5282497ecf3ad613c142784030428456c50220dfbbd"} Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.651573 4907 generic.go:334] "Generic (PLEG): container finished" podID="66fa6e0b-4574-4377-ab95-8a8843ffde6d" containerID="8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332" exitCode=0 Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.651643 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.651689 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66fa6e0b-4574-4377-ab95-8a8843ffde6d","Type":"ContainerDied","Data":"8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332"} Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.651758 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"66fa6e0b-4574-4377-ab95-8a8843ffde6d","Type":"ContainerDied","Data":"27fcdf6c27bda289b87cd0e1baba1c1004ae62ebd0445fa5ea879f6d10b38a6f"} Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.651789 4907 scope.go:117] "RemoveContainer" containerID="8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.685865 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.685842928 podStartE2EDuration="2.685842928s" podCreationTimestamp="2025-11-29 14:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:54:16.676105923 +0000 UTC m=+1554.662943585" watchObservedRunningTime="2025-11-29 14:54:16.685842928 +0000 UTC m=+1554.672680580" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.693282 4907 scope.go:117] "RemoveContainer" containerID="8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332" Nov 29 14:54:16 crc kubenswrapper[4907]: E1129 14:54:16.693757 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332\": container with ID starting with 8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332 not found: ID does not exist" containerID="8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.693796 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332"} err="failed to get container status \"8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332\": rpc error: code = NotFound desc = could not find container \"8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332\": container with ID starting with 8d1aa3ffe22a78f14ad9a0b8b45a458c5b2a9c2afdb6c478663ad0645d5dd332 not found: ID does not exist" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.710652 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.729332 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.743492 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:54:16 crc kubenswrapper[4907]: E1129 14:54:16.744232 4907 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="66fa6e0b-4574-4377-ab95-8a8843ffde6d" containerName="nova-scheduler-scheduler" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.744321 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="66fa6e0b-4574-4377-ab95-8a8843ffde6d" containerName="nova-scheduler-scheduler" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.744689 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="66fa6e0b-4574-4377-ab95-8a8843ffde6d" containerName="nova-scheduler-scheduler" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.745655 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.748818 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.775664 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.825655 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz4x9\" (UniqueName: \"kubernetes.io/projected/ad18f82a-09a8-4a8c-ae4b-677fa5dd280d-kube-api-access-lz4x9\") pod \"nova-scheduler-0\" (UID: \"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d\") " pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.825873 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad18f82a-09a8-4a8c-ae4b-677fa5dd280d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d\") " pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.826004 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ad18f82a-09a8-4a8c-ae4b-677fa5dd280d-config-data\") pod \"nova-scheduler-0\" (UID: \"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d\") " pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.928159 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz4x9\" (UniqueName: \"kubernetes.io/projected/ad18f82a-09a8-4a8c-ae4b-677fa5dd280d-kube-api-access-lz4x9\") pod \"nova-scheduler-0\" (UID: \"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d\") " pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.928629 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad18f82a-09a8-4a8c-ae4b-677fa5dd280d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d\") " pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.928761 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad18f82a-09a8-4a8c-ae4b-677fa5dd280d-config-data\") pod \"nova-scheduler-0\" (UID: \"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d\") " pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.934287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad18f82a-09a8-4a8c-ae4b-677fa5dd280d-config-data\") pod \"nova-scheduler-0\" (UID: \"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d\") " pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.934547 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad18f82a-09a8-4a8c-ae4b-677fa5dd280d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d\") " 
pod="openstack/nova-scheduler-0" Nov 29 14:54:16 crc kubenswrapper[4907]: I1129 14:54:16.947063 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz4x9\" (UniqueName: \"kubernetes.io/projected/ad18f82a-09a8-4a8c-ae4b-677fa5dd280d-kube-api-access-lz4x9\") pod \"nova-scheduler-0\" (UID: \"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d\") " pod="openstack/nova-scheduler-0" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.076807 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.622495 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.674638 4907 generic.go:334] "Generic (PLEG): container finished" podID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerID="79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5" exitCode=0 Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.675780 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.676288 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"08c24285-2f83-4e3b-8c8d-acdb01ed50ee","Type":"ContainerDied","Data":"79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5"} Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.676315 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"08c24285-2f83-4e3b-8c8d-acdb01ed50ee","Type":"ContainerDied","Data":"563ba47f2ebd4876340b2883c24e0187454b525432811bff02781c397ab3be34"} Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.676333 4907 scope.go:117] "RemoveContainer" containerID="79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.703219 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.713509 4907 scope.go:117] "RemoveContainer" containerID="ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.748356 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-combined-ca-bundle\") pod \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.748577 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-config-data\") pod \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.748760 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-sfkb6\" (UniqueName: \"kubernetes.io/projected/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-kube-api-access-sfkb6\") pod \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.748796 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-logs\") pod \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.748827 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-nova-metadata-tls-certs\") pod \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\" (UID: \"08c24285-2f83-4e3b-8c8d-acdb01ed50ee\") " Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.751234 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-logs" (OuterVolumeSpecName: "logs") pod "08c24285-2f83-4e3b-8c8d-acdb01ed50ee" (UID: "08c24285-2f83-4e3b-8c8d-acdb01ed50ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.753954 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-kube-api-access-sfkb6" (OuterVolumeSpecName: "kube-api-access-sfkb6") pod "08c24285-2f83-4e3b-8c8d-acdb01ed50ee" (UID: "08c24285-2f83-4e3b-8c8d-acdb01ed50ee"). InnerVolumeSpecName "kube-api-access-sfkb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.758099 4907 scope.go:117] "RemoveContainer" containerID="79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5" Nov 29 14:54:17 crc kubenswrapper[4907]: E1129 14:54:17.758859 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5\": container with ID starting with 79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5 not found: ID does not exist" containerID="79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.758908 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5"} err="failed to get container status \"79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5\": rpc error: code = NotFound desc = could not find container \"79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5\": container with ID starting with 79477305c12e66f134f5b1d2b4913a0624b38583e397d6115b04f407de3ddcd5 not found: ID does not exist" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.758941 4907 scope.go:117] "RemoveContainer" containerID="ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152" Nov 29 14:54:17 crc kubenswrapper[4907]: E1129 14:54:17.759580 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152\": container with ID starting with ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152 not found: ID does not exist" containerID="ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.759624 
4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152"} err="failed to get container status \"ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152\": rpc error: code = NotFound desc = could not find container \"ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152\": container with ID starting with ffcd0560c45bbb716a32a3c4359a98ad1ffb65f5233081566accebe75cd9b152 not found: ID does not exist" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.783566 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c24285-2f83-4e3b-8c8d-acdb01ed50ee" (UID: "08c24285-2f83-4e3b-8c8d-acdb01ed50ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.787392 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-config-data" (OuterVolumeSpecName: "config-data") pod "08c24285-2f83-4e3b-8c8d-acdb01ed50ee" (UID: "08c24285-2f83-4e3b-8c8d-acdb01ed50ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.812726 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "08c24285-2f83-4e3b-8c8d-acdb01ed50ee" (UID: "08c24285-2f83-4e3b-8c8d-acdb01ed50ee"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.861895 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.861942 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfkb6\" (UniqueName: \"kubernetes.io/projected/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-kube-api-access-sfkb6\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.861954 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-logs\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.861963 4907 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:17 crc kubenswrapper[4907]: I1129 14:54:17.861972 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c24285-2f83-4e3b-8c8d-acdb01ed50ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.012869 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.023497 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.041692 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:54:18 crc kubenswrapper[4907]: E1129 14:54:18.042104 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-metadata" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.042122 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-metadata" Nov 29 14:54:18 crc kubenswrapper[4907]: E1129 14:54:18.042152 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-log" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.042158 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-log" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.042747 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-metadata" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.042791 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-log" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.045301 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.058145 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.058205 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.059450 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.167983 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e651a72-97da-438c-9791-42506da10f6f-config-data\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.168046 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e651a72-97da-438c-9791-42506da10f6f-logs\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.168148 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e651a72-97da-438c-9791-42506da10f6f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.168302 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e651a72-97da-438c-9791-42506da10f6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.168362 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm8pq\" (UniqueName: \"kubernetes.io/projected/8e651a72-97da-438c-9791-42506da10f6f-kube-api-access-vm8pq\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.270902 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e651a72-97da-438c-9791-42506da10f6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.271012 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm8pq\" (UniqueName: \"kubernetes.io/projected/8e651a72-97da-438c-9791-42506da10f6f-kube-api-access-vm8pq\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.271115 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e651a72-97da-438c-9791-42506da10f6f-config-data\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.271162 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e651a72-97da-438c-9791-42506da10f6f-logs\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.271581 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e651a72-97da-438c-9791-42506da10f6f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.271889 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e651a72-97da-438c-9791-42506da10f6f-logs\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.276530 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e651a72-97da-438c-9791-42506da10f6f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.276823 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e651a72-97da-438c-9791-42506da10f6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.292390 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e651a72-97da-438c-9791-42506da10f6f-config-data\") pod \"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.292831 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm8pq\" (UniqueName: \"kubernetes.io/projected/8e651a72-97da-438c-9791-42506da10f6f-kube-api-access-vm8pq\") pod 
\"nova-metadata-0\" (UID: \"8e651a72-97da-438c-9791-42506da10f6f\") " pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.431184 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.496637 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" path="/var/lib/kubelet/pods/08c24285-2f83-4e3b-8c8d-acdb01ed50ee/volumes" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.497873 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66fa6e0b-4574-4377-ab95-8a8843ffde6d" path="/var/lib/kubelet/pods/66fa6e0b-4574-4377-ab95-8a8843ffde6d/volumes" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.687329 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d","Type":"ContainerStarted","Data":"ee031079d8cf0f0eeed6579495365f247bb51992312a456e4ec5b6f75df718fd"} Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.687680 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad18f82a-09a8-4a8c-ae4b-677fa5dd280d","Type":"ContainerStarted","Data":"6182700003f06d73ba8e004d07a29195db3f664b98c5530bcdbb6a9ee92560eb"} Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.711169 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.711146362 podStartE2EDuration="2.711146362s" podCreationTimestamp="2025-11-29 14:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:54:18.699552725 +0000 UTC m=+1556.686390367" watchObservedRunningTime="2025-11-29 14:54:18.711146362 +0000 UTC m=+1556.697984014" Nov 29 14:54:18 crc kubenswrapper[4907]: I1129 14:54:18.882609 
4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 14:54:18 crc kubenswrapper[4907]: W1129 14:54:18.892663 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e651a72_97da_438c_9791_42506da10f6f.slice/crio-7bbddf29c4a5496d28ca05b80443b53df6b461a43b2682d9915483c2c8a5438e WatchSource:0}: Error finding container 7bbddf29c4a5496d28ca05b80443b53df6b461a43b2682d9915483c2c8a5438e: Status 404 returned error can't find the container with id 7bbddf29c4a5496d28ca05b80443b53df6b461a43b2682d9915483c2c8a5438e Nov 29 14:54:19 crc kubenswrapper[4907]: I1129 14:54:19.705574 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e651a72-97da-438c-9791-42506da10f6f","Type":"ContainerStarted","Data":"b5cc14029b096d7cd3403f52bf7c8c377ecabfe8065aee5307c5dbe6005bd348"} Nov 29 14:54:19 crc kubenswrapper[4907]: I1129 14:54:19.705969 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e651a72-97da-438c-9791-42506da10f6f","Type":"ContainerStarted","Data":"b4979e86f29bd63ea4e440beb1f43e67913d05753efc8ae65ee30d14ff03ad7f"} Nov 29 14:54:19 crc kubenswrapper[4907]: I1129 14:54:19.705994 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e651a72-97da-438c-9791-42506da10f6f","Type":"ContainerStarted","Data":"7bbddf29c4a5496d28ca05b80443b53df6b461a43b2682d9915483c2c8a5438e"} Nov 29 14:54:22 crc kubenswrapper[4907]: I1129 14:54:22.077579 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 14:54:22 crc kubenswrapper[4907]: I1129 14:54:22.307057 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.243:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 29 14:54:22 crc kubenswrapper[4907]: I1129 14:54:22.307526 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="08c24285-2f83-4e3b-8c8d-acdb01ed50ee" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.243:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 29 14:54:23 crc kubenswrapper[4907]: I1129 14:54:23.466627 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 14:54:23 crc kubenswrapper[4907]: I1129 14:54:23.467035 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 14:54:25 crc kubenswrapper[4907]: I1129 14:54:25.299555 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 14:54:25 crc kubenswrapper[4907]: I1129 14:54:25.299920 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 14:54:26 crc kubenswrapper[4907]: I1129 14:54:26.317687 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3bf63a4b-e850-4b1c-b7ea-00bf87d3d125" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.254:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 14:54:26 crc kubenswrapper[4907]: I1129 14:54:26.317717 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3bf63a4b-e850-4b1c-b7ea-00bf87d3d125" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.254:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 14:54:27 crc kubenswrapper[4907]: I1129 14:54:27.077319 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 14:54:27 crc kubenswrapper[4907]: I1129 14:54:27.137285 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 14:54:27 crc kubenswrapper[4907]: I1129 14:54:27.162307 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=9.162283009 podStartE2EDuration="9.162283009s" podCreationTimestamp="2025-11-29 14:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:54:19.732864618 +0000 UTC m=+1557.719702270" watchObservedRunningTime="2025-11-29 14:54:27.162283009 +0000 UTC m=+1565.149120691" Nov 29 14:54:27 crc kubenswrapper[4907]: I1129 14:54:27.885401 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 14:54:28 crc kubenswrapper[4907]: I1129 14:54:28.432196 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 14:54:28 crc kubenswrapper[4907]: I1129 14:54:28.432274 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 14:54:28 crc kubenswrapper[4907]: I1129 14:54:28.490674 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:54:28 crc kubenswrapper[4907]: I1129 14:54:28.490747 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:54:29 crc kubenswrapper[4907]: I1129 14:54:29.452696 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8e651a72-97da-438c-9791-42506da10f6f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.0:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 14:54:29 crc kubenswrapper[4907]: I1129 14:54:29.452713 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8e651a72-97da-438c-9791-42506da10f6f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.0:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 14:54:33 crc kubenswrapper[4907]: I1129 14:54:33.996942 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 29 14:54:33 crc kubenswrapper[4907]: I1129 14:54:33.997705 4907 generic.go:334] "Generic (PLEG): container finished" podID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerID="df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad" exitCode=137 Nov 29 14:54:33 crc kubenswrapper[4907]: I1129 14:54:33.997745 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerDied","Data":"df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad"} Nov 29 14:54:33 crc kubenswrapper[4907]: I1129 14:54:33.997772 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d163ea39-da39-4c8c-8a6a-552e55751e61","Type":"ContainerDied","Data":"17c8f1c1396fe71a6a038e743b999c7422b332781b7f2ce265837e1c7ec34b95"} Nov 29 14:54:33 crc kubenswrapper[4907]: I1129 14:54:33.997789 4907 scope.go:117] "RemoveContainer" 
containerID="df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.029087 4907 scope.go:117] "RemoveContainer" containerID="318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.039842 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-combined-ca-bundle\") pod \"d163ea39-da39-4c8c-8a6a-552e55751e61\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.040024 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxqzw\" (UniqueName: \"kubernetes.io/projected/d163ea39-da39-4c8c-8a6a-552e55751e61-kube-api-access-xxqzw\") pod \"d163ea39-da39-4c8c-8a6a-552e55751e61\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.040199 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-config-data\") pod \"d163ea39-da39-4c8c-8a6a-552e55751e61\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.040358 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-scripts\") pod \"d163ea39-da39-4c8c-8a6a-552e55751e61\" (UID: \"d163ea39-da39-4c8c-8a6a-552e55751e61\") " Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.049244 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-scripts" (OuterVolumeSpecName: "scripts") pod "d163ea39-da39-4c8c-8a6a-552e55751e61" (UID: "d163ea39-da39-4c8c-8a6a-552e55751e61"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.066533 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d163ea39-da39-4c8c-8a6a-552e55751e61-kube-api-access-xxqzw" (OuterVolumeSpecName: "kube-api-access-xxqzw") pod "d163ea39-da39-4c8c-8a6a-552e55751e61" (UID: "d163ea39-da39-4c8c-8a6a-552e55751e61"). InnerVolumeSpecName "kube-api-access-xxqzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.089927 4907 scope.go:117] "RemoveContainer" containerID="d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.143915 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.144156 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxqzw\" (UniqueName: \"kubernetes.io/projected/d163ea39-da39-4c8c-8a6a-552e55751e61-kube-api-access-xxqzw\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.180124 4907 scope.go:117] "RemoveContainer" containerID="b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.203488 4907 scope.go:117] "RemoveContainer" containerID="df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad" Nov 29 14:54:34 crc kubenswrapper[4907]: E1129 14:54:34.203919 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad\": container with ID starting with df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad not found: ID does not 
exist" containerID="df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.203982 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad"} err="failed to get container status \"df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad\": rpc error: code = NotFound desc = could not find container \"df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad\": container with ID starting with df6ea71164202b256f79c9de504d77a27c8f1a9bfdb68780a1b3718ef14a71ad not found: ID does not exist" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.204234 4907 scope.go:117] "RemoveContainer" containerID="318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.204599 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-config-data" (OuterVolumeSpecName: "config-data") pod "d163ea39-da39-4c8c-8a6a-552e55751e61" (UID: "d163ea39-da39-4c8c-8a6a-552e55751e61"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:34 crc kubenswrapper[4907]: E1129 14:54:34.205004 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78\": container with ID starting with 318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78 not found: ID does not exist" containerID="318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.205037 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78"} err="failed to get container status \"318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78\": rpc error: code = NotFound desc = could not find container \"318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78\": container with ID starting with 318318d367a07d639eeaed76fbc21cf04a5e2a945a46efe1c1f75d3eb3913e78 not found: ID does not exist" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.205065 4907 scope.go:117] "RemoveContainer" containerID="d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714" Nov 29 14:54:34 crc kubenswrapper[4907]: E1129 14:54:34.205399 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714\": container with ID starting with d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714 not found: ID does not exist" containerID="d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.205474 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714"} err="failed 
to get container status \"d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714\": rpc error: code = NotFound desc = could not find container \"d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714\": container with ID starting with d85a391edaab5d0be88d3300e24d32dc745c97fc6d3a68defaa6068bbba9a714 not found: ID does not exist" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.205502 4907 scope.go:117] "RemoveContainer" containerID="b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9" Nov 29 14:54:34 crc kubenswrapper[4907]: E1129 14:54:34.205984 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9\": container with ID starting with b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9 not found: ID does not exist" containerID="b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.206021 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9"} err="failed to get container status \"b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9\": rpc error: code = NotFound desc = could not find container \"b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9\": container with ID starting with b9bb7292f26b129ba3592868fbff2feb623b014dc4436a0b44016f0a0fa4bba9 not found: ID does not exist" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.247036 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.258272 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d163ea39-da39-4c8c-8a6a-552e55751e61" (UID: "d163ea39-da39-4c8c-8a6a-552e55751e61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:34 crc kubenswrapper[4907]: I1129 14:54:34.349166 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d163ea39-da39-4c8c-8a6a-552e55751e61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.011031 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.047751 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.062219 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.079730 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 29 14:54:35 crc kubenswrapper[4907]: E1129 14:54:35.080428 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-notifier" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.080466 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-notifier" Nov 29 14:54:35 crc kubenswrapper[4907]: E1129 14:54:35.080503 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-listener" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.080512 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-listener" Nov 29 14:54:35 crc kubenswrapper[4907]: 
E1129 14:54:35.080542 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-api" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.080552 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-api" Nov 29 14:54:35 crc kubenswrapper[4907]: E1129 14:54:35.080570 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-evaluator" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.080578 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-evaluator" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.080861 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-listener" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.080899 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-evaluator" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.080915 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-notifier" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.080944 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" containerName="aodh-api" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.083676 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.089760 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.090318 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.090429 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-b88t7" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.090557 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.092524 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.092949 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.174225 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.174325 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-scripts\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.174352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-internal-tls-certs\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.174399 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvlqq\" (UniqueName: \"kubernetes.io/projected/10c5cb11-136f-44a4-a616-39ed1723237e-kube-api-access-rvlqq\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.174420 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-public-tls-certs\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.174467 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-config-data\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.277339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-scripts\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.277390 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-internal-tls-certs\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 
14:54:35.277450 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvlqq\" (UniqueName: \"kubernetes.io/projected/10c5cb11-136f-44a4-a616-39ed1723237e-kube-api-access-rvlqq\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.277468 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-public-tls-certs\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.277496 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-config-data\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.277655 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.283269 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-scripts\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.284132 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-public-tls-certs\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " 
pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.284171 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.284347 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-internal-tls-certs\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.287314 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-config-data\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.299493 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvlqq\" (UniqueName: \"kubernetes.io/projected/10c5cb11-136f-44a4-a616-39ed1723237e-kube-api-access-rvlqq\") pod \"aodh-0\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.308803 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.309239 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.311143 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.326127 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.401809 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 29 14:54:35 crc kubenswrapper[4907]: I1129 14:54:35.889720 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 29 14:54:36 crc kubenswrapper[4907]: I1129 14:54:36.022309 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerStarted","Data":"8a5a07cf25175044421e134e9c60ba32ca0e06c4d1711a2aa427248fd2017799"} Nov 29 14:54:36 crc kubenswrapper[4907]: I1129 14:54:36.023168 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 14:54:36 crc kubenswrapper[4907]: I1129 14:54:36.032406 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 14:54:36 crc kubenswrapper[4907]: I1129 14:54:36.497219 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d163ea39-da39-4c8c-8a6a-552e55751e61" path="/var/lib/kubelet/pods/d163ea39-da39-4c8c-8a6a-552e55751e61/volumes" Nov 29 14:54:37 crc kubenswrapper[4907]: I1129 14:54:37.035220 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerStarted","Data":"c81b19aac835c840d324b3961a7b59ad255930d8a67a01bb9b75945d77cfbf4c"} Nov 29 14:54:38 crc kubenswrapper[4907]: I1129 14:54:38.055514 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerStarted","Data":"6bad483b902c5fb5512c2928e012996114decb1d42421e75d2632c700f8acfe7"} Nov 29 14:54:38 crc kubenswrapper[4907]: I1129 14:54:38.439491 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 14:54:38 crc 
kubenswrapper[4907]: I1129 14:54:38.440494 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 14:54:38 crc kubenswrapper[4907]: I1129 14:54:38.447725 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 14:54:39 crc kubenswrapper[4907]: I1129 14:54:39.091514 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerStarted","Data":"03b493e8e697ff13a13c62970b81ca8b601cfb98dc11ede4f617cbf54546b31f"} Nov 29 14:54:39 crc kubenswrapper[4907]: I1129 14:54:39.098395 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 14:54:40 crc kubenswrapper[4907]: I1129 14:54:40.104749 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerStarted","Data":"94a71fea1213343a7a067f53646d435c8e0e7770ab67d13a6d620564f004b446"} Nov 29 14:54:40 crc kubenswrapper[4907]: I1129 14:54:40.133800 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.112260878 podStartE2EDuration="5.133781329s" podCreationTimestamp="2025-11-29 14:54:35 +0000 UTC" firstStartedPulling="2025-11-29 14:54:35.905160574 +0000 UTC m=+1573.891998266" lastFinishedPulling="2025-11-29 14:54:38.926681055 +0000 UTC m=+1576.913518717" observedRunningTime="2025-11-29 14:54:40.12423638 +0000 UTC m=+1578.111074032" watchObservedRunningTime="2025-11-29 14:54:40.133781329 +0000 UTC m=+1578.120618981" Nov 29 14:54:41 crc kubenswrapper[4907]: I1129 14:54:41.215171 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 29 14:54:45 crc kubenswrapper[4907]: I1129 14:54:45.921297 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] 
Nov 29 14:54:45 crc kubenswrapper[4907]: I1129 14:54:45.922103 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ce177523-3519-4f04-b71c-7869b8bb5810" containerName="kube-state-metrics" containerID="cri-o://cf8589876a6e854b169096f90c834a8b201454ffd021d1f879afe255287d27af" gracePeriod=30 Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.022941 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.023894 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="27a0a09a-72e9-4d7f-94cf-fe1717484497" containerName="mysqld-exporter" containerID="cri-o://de4679883cddae5aaa50a227b5de2694df612001c9159facbbd6ba65cf25694b" gracePeriod=30 Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.178298 4907 generic.go:334] "Generic (PLEG): container finished" podID="27a0a09a-72e9-4d7f-94cf-fe1717484497" containerID="de4679883cddae5aaa50a227b5de2694df612001c9159facbbd6ba65cf25694b" exitCode=2 Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.178397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"27a0a09a-72e9-4d7f-94cf-fe1717484497","Type":"ContainerDied","Data":"de4679883cddae5aaa50a227b5de2694df612001c9159facbbd6ba65cf25694b"} Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.180974 4907 generic.go:334] "Generic (PLEG): container finished" podID="ce177523-3519-4f04-b71c-7869b8bb5810" containerID="cf8589876a6e854b169096f90c834a8b201454ffd021d1f879afe255287d27af" exitCode=2 Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.181021 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce177523-3519-4f04-b71c-7869b8bb5810","Type":"ContainerDied","Data":"cf8589876a6e854b169096f90c834a8b201454ffd021d1f879afe255287d27af"} Nov 29 14:54:46 crc 
kubenswrapper[4907]: I1129 14:54:46.468019 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.490494 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqk5\" (UniqueName: \"kubernetes.io/projected/ce177523-3519-4f04-b71c-7869b8bb5810-kube-api-access-6mqk5\") pod \"ce177523-3519-4f04-b71c-7869b8bb5810\" (UID: \"ce177523-3519-4f04-b71c-7869b8bb5810\") " Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.498363 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce177523-3519-4f04-b71c-7869b8bb5810-kube-api-access-6mqk5" (OuterVolumeSpecName: "kube-api-access-6mqk5") pod "ce177523-3519-4f04-b71c-7869b8bb5810" (UID: "ce177523-3519-4f04-b71c-7869b8bb5810"). InnerVolumeSpecName "kube-api-access-6mqk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.562240 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.592197 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-config-data\") pod \"27a0a09a-72e9-4d7f-94cf-fe1717484497\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.592268 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44ltd\" (UniqueName: \"kubernetes.io/projected/27a0a09a-72e9-4d7f-94cf-fe1717484497-kube-api-access-44ltd\") pod \"27a0a09a-72e9-4d7f-94cf-fe1717484497\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.592306 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-combined-ca-bundle\") pod \"27a0a09a-72e9-4d7f-94cf-fe1717484497\" (UID: \"27a0a09a-72e9-4d7f-94cf-fe1717484497\") " Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.593096 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqk5\" (UniqueName: \"kubernetes.io/projected/ce177523-3519-4f04-b71c-7869b8bb5810-kube-api-access-6mqk5\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.598916 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a0a09a-72e9-4d7f-94cf-fe1717484497-kube-api-access-44ltd" (OuterVolumeSpecName: "kube-api-access-44ltd") pod "27a0a09a-72e9-4d7f-94cf-fe1717484497" (UID: "27a0a09a-72e9-4d7f-94cf-fe1717484497"). InnerVolumeSpecName "kube-api-access-44ltd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.644706 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27a0a09a-72e9-4d7f-94cf-fe1717484497" (UID: "27a0a09a-72e9-4d7f-94cf-fe1717484497"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.667844 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-config-data" (OuterVolumeSpecName: "config-data") pod "27a0a09a-72e9-4d7f-94cf-fe1717484497" (UID: "27a0a09a-72e9-4d7f-94cf-fe1717484497"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.695298 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.695497 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44ltd\" (UniqueName: \"kubernetes.io/projected/27a0a09a-72e9-4d7f-94cf-fe1717484497-kube-api-access-44ltd\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:46 crc kubenswrapper[4907]: I1129 14:54:46.695624 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a0a09a-72e9-4d7f-94cf-fe1717484497-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.200267 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"27a0a09a-72e9-4d7f-94cf-fe1717484497","Type":"ContainerDied","Data":"e09f09d6ae4eb2316d97d4fb4e2015887acaeca76f3ca6a4f35ecd62ec28f393"} Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.200343 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.200633 4907 scope.go:117] "RemoveContainer" containerID="de4679883cddae5aaa50a227b5de2694df612001c9159facbbd6ba65cf25694b" Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.203351 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ce177523-3519-4f04-b71c-7869b8bb5810","Type":"ContainerDied","Data":"e0c6a10f72f299b01eeb9c055ed134aaa1ed211f1eb60bedca5dca5f580d06f5"} Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.204119 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.271007 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.287096 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.304771 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Nov 29 14:54:47 crc kubenswrapper[4907]: E1129 14:54:47.306260 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a0a09a-72e9-4d7f-94cf-fe1717484497" containerName="mysqld-exporter" Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.306327 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a0a09a-72e9-4d7f-94cf-fe1717484497" containerName="mysqld-exporter" Nov 29 14:54:47 crc kubenswrapper[4907]: E1129 14:54:47.306408 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ce177523-3519-4f04-b71c-7869b8bb5810" containerName="kube-state-metrics"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.306424 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce177523-3519-4f04-b71c-7869b8bb5810" containerName="kube-state-metrics"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.307542 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a0a09a-72e9-4d7f-94cf-fe1717484497" containerName="mysqld-exporter"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.307632 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce177523-3519-4f04-b71c-7869b8bb5810" containerName="kube-state-metrics"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.309560 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.313263 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.314335 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.319908 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.340827 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.343011 4907 scope.go:117] "RemoveContainer" containerID="cf8589876a6e854b169096f90c834a8b201454ffd021d1f879afe255287d27af"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.360703 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.385238 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.394409 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.397215 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.397491 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.415535 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.426109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.426268 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xckbq\" (UniqueName: \"kubernetes.io/projected/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-kube-api-access-xckbq\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.426379 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/406365ac-b529-4ca8-be52-8b802da87feb-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.426584 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406365ac-b529-4ca8-be52-8b802da87feb-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.428885 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.429031 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406365ac-b529-4ca8-be52-8b802da87feb-config-data\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.429275 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df77h\" (UniqueName: \"kubernetes.io/projected/406365ac-b529-4ca8-be52-8b802da87feb-kube-api-access-df77h\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.429469 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.531694 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406365ac-b529-4ca8-be52-8b802da87feb-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.531754 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.531797 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406365ac-b529-4ca8-be52-8b802da87feb-config-data\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.531894 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df77h\" (UniqueName: \"kubernetes.io/projected/406365ac-b529-4ca8-be52-8b802da87feb-kube-api-access-df77h\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.531957 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.532098 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.532129 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xckbq\" (UniqueName: \"kubernetes.io/projected/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-kube-api-access-xckbq\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.532152 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/406365ac-b529-4ca8-be52-8b802da87feb-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.535926 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406365ac-b529-4ca8-be52-8b802da87feb-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.535991 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.536712 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406365ac-b529-4ca8-be52-8b802da87feb-config-data\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.537220 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.538828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/406365ac-b529-4ca8-be52-8b802da87feb-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.556735 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.557429 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xckbq\" (UniqueName: \"kubernetes.io/projected/e119dfa1-0a93-4e7a-9b97-8530dbde1fbc-kube-api-access-xckbq\") pod \"kube-state-metrics-0\" (UID: \"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc\") " pod="openstack/kube-state-metrics-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.558581 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df77h\" (UniqueName: \"kubernetes.io/projected/406365ac-b529-4ca8-be52-8b802da87feb-kube-api-access-df77h\") pod \"mysqld-exporter-0\" (UID: \"406365ac-b529-4ca8-be52-8b802da87feb\") " pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.653030 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Nov 29 14:54:47 crc kubenswrapper[4907]: I1129 14:54:47.731606 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.024571 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.025128 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="ceilometer-central-agent" containerID="cri-o://2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4" gracePeriod=30
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.025631 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="proxy-httpd" containerID="cri-o://0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f" gracePeriod=30
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.026002 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="ceilometer-notification-agent" containerID="cri-o://060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a" gracePeriod=30
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.026050 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="sg-core" containerID="cri-o://33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5" gracePeriod=30
Nov 29 14:54:48 crc kubenswrapper[4907]: W1129 14:54:48.216524 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod406365ac_b529_4ca8_be52_8b802da87feb.slice/crio-b6ee0a10f27e58f12c547fbd64647be1d2419d939ab6a1d93ebadf942e121120 WatchSource:0}: Error finding container b6ee0a10f27e58f12c547fbd64647be1d2419d939ab6a1d93ebadf942e121120: Status 404 returned error can't find the container with id b6ee0a10f27e58f12c547fbd64647be1d2419d939ab6a1d93ebadf942e121120
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.217324 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.221817 4907 generic.go:334] "Generic (PLEG): container finished" podID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerID="33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5" exitCode=2
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.221879 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerDied","Data":"33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5"}
Nov 29 14:54:48 crc kubenswrapper[4907]: E1129 14:54:48.288905 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5f072fc_976b_4cc4_8c87_fe3c52ab9829.slice/crio-0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f.scope\": RecentStats: unable to find data in memory cache]"
Nov 29 14:54:48 crc kubenswrapper[4907]: E1129 14:54:48.290588 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5f072fc_976b_4cc4_8c87_fe3c52ab9829.slice/crio-conmon-0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5f072fc_976b_4cc4_8c87_fe3c52ab9829.slice/crio-0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f.scope\": RecentStats: unable to find data in memory cache]"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.329626 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.496620 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a0a09a-72e9-4d7f-94cf-fe1717484497" path="/var/lib/kubelet/pods/27a0a09a-72e9-4d7f-94cf-fe1717484497/volumes"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.497861 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce177523-3519-4f04-b71c-7869b8bb5810" path="/var/lib/kubelet/pods/ce177523-3519-4f04-b71c-7869b8bb5810/volumes"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.499825 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lk45h"]
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.506163 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.508139 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lk45h"]
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.574151 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-catalog-content\") pod \"community-operators-lk45h\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.574497 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-utilities\") pod \"community-operators-lk45h\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.575282 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw946\" (UniqueName: \"kubernetes.io/projected/d5a8933b-b26c-4401-aa75-bb1e4796d546-kube-api-access-kw946\") pod \"community-operators-lk45h\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.677998 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-catalog-content\") pod \"community-operators-lk45h\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.678241 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-utilities\") pod \"community-operators-lk45h\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.678526 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw946\" (UniqueName: \"kubernetes.io/projected/d5a8933b-b26c-4401-aa75-bb1e4796d546-kube-api-access-kw946\") pod \"community-operators-lk45h\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.678714 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-catalog-content\") pod \"community-operators-lk45h\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.678742 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-utilities\") pod \"community-operators-lk45h\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.702128 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw946\" (UniqueName: \"kubernetes.io/projected/d5a8933b-b26c-4401-aa75-bb1e4796d546-kube-api-access-kw946\") pod \"community-operators-lk45h\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:48 crc kubenswrapper[4907]: I1129 14:54:48.833428 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lk45h"
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.239618 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerDied","Data":"0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f"}
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.239516 4907 generic.go:334] "Generic (PLEG): container finished" podID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerID="0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f" exitCode=0
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.246323 4907 generic.go:334] "Generic (PLEG): container finished" podID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerID="2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4" exitCode=0
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.246381 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerDied","Data":"2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4"}
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.249217 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc","Type":"ContainerStarted","Data":"9a050bb70bd3f66cf1debfd90f2d13e64388e7b8e50bfae696f852bb4a5e2da3"}
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.249280 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e119dfa1-0a93-4e7a-9b97-8530dbde1fbc","Type":"ContainerStarted","Data":"f716d072799b66d853ed046c96fa44d1e8d6996e86c1cc98349adbe26f2125a4"}
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.249319 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.260003 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"406365ac-b529-4ca8-be52-8b802da87feb","Type":"ContainerStarted","Data":"bd2667fc9a169dfa908468dab710bf875fce5b68347d10e05d9d88177b984471"}
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.260040 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"406365ac-b529-4ca8-be52-8b802da87feb","Type":"ContainerStarted","Data":"b6ee0a10f27e58f12c547fbd64647be1d2419d939ab6a1d93ebadf942e121120"}
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.274715 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.874848653 podStartE2EDuration="2.274697816s" podCreationTimestamp="2025-11-29 14:54:47 +0000 UTC" firstStartedPulling="2025-11-29 14:54:48.33849284 +0000 UTC m=+1586.325330482" lastFinishedPulling="2025-11-29 14:54:48.738341993 +0000 UTC m=+1586.725179645" observedRunningTime="2025-11-29 14:54:49.266288639 +0000 UTC m=+1587.253126291" watchObservedRunningTime="2025-11-29 14:54:49.274697816 +0000 UTC m=+1587.261535468"
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.312561 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=1.6833555329999998 podStartE2EDuration="2.312537173s" podCreationTimestamp="2025-11-29 14:54:47 +0000 UTC" firstStartedPulling="2025-11-29 14:54:48.220576815 +0000 UTC m=+1586.207414457" lastFinishedPulling="2025-11-29 14:54:48.849758445 +0000 UTC m=+1586.836596097" observedRunningTime="2025-11-29 14:54:49.283666129 +0000 UTC m=+1587.270503781" watchObservedRunningTime="2025-11-29 14:54:49.312537173 +0000 UTC m=+1587.299374825"
Nov 29 14:54:49 crc kubenswrapper[4907]: I1129 14:54:49.423212 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lk45h"]
Nov 29 14:54:49 crc kubenswrapper[4907]: W1129 14:54:49.431524 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a8933b_b26c_4401_aa75_bb1e4796d546.slice/crio-c57e49543f7d5675a01eac8051da7da6450b0af2c98700ee96d8736d8719acbf WatchSource:0}: Error finding container c57e49543f7d5675a01eac8051da7da6450b0af2c98700ee96d8736d8719acbf: Status 404 returned error can't find the container with id c57e49543f7d5675a01eac8051da7da6450b0af2c98700ee96d8736d8719acbf
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.023336 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.120825 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrhnh\" (UniqueName: \"kubernetes.io/projected/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-kube-api-access-mrhnh\") pod \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") "
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.121113 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-scripts\") pod \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") "
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.121177 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-combined-ca-bundle\") pod \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") "
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.121237 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-run-httpd\") pod \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") "
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.121385 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-sg-core-conf-yaml\") pod \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") "
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.121462 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-log-httpd\") pod \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") "
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.121538 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-config-data\") pod \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\" (UID: \"a5f072fc-976b-4cc4-8c87-fe3c52ab9829\") "
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.123907 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a5f072fc-976b-4cc4-8c87-fe3c52ab9829" (UID: "a5f072fc-976b-4cc4-8c87-fe3c52ab9829"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.126617 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a5f072fc-976b-4cc4-8c87-fe3c52ab9829" (UID: "a5f072fc-976b-4cc4-8c87-fe3c52ab9829"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.129568 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-scripts" (OuterVolumeSpecName: "scripts") pod "a5f072fc-976b-4cc4-8c87-fe3c52ab9829" (UID: "a5f072fc-976b-4cc4-8c87-fe3c52ab9829"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.131630 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-kube-api-access-mrhnh" (OuterVolumeSpecName: "kube-api-access-mrhnh") pod "a5f072fc-976b-4cc4-8c87-fe3c52ab9829" (UID: "a5f072fc-976b-4cc4-8c87-fe3c52ab9829"). InnerVolumeSpecName "kube-api-access-mrhnh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.168580 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a5f072fc-976b-4cc4-8c87-fe3c52ab9829" (UID: "a5f072fc-976b-4cc4-8c87-fe3c52ab9829"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.224729 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrhnh\" (UniqueName: \"kubernetes.io/projected/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-kube-api-access-mrhnh\") on node \"crc\" DevicePath \"\""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.224760 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.224770 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.224782 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.224790 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.260059 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5f072fc-976b-4cc4-8c87-fe3c52ab9829" (UID: "a5f072fc-976b-4cc4-8c87-fe3c52ab9829"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.276134 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-config-data" (OuterVolumeSpecName: "config-data") pod "a5f072fc-976b-4cc4-8c87-fe3c52ab9829" (UID: "a5f072fc-976b-4cc4-8c87-fe3c52ab9829"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.279836 4907 generic.go:334] "Generic (PLEG): container finished" podID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerID="060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a" exitCode=0
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.279923 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.279907 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerDied","Data":"060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a"}
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.281020 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5f072fc-976b-4cc4-8c87-fe3c52ab9829","Type":"ContainerDied","Data":"6b53c5eae9b91f0726ec0da1501e5d61dfa42e8f0706cf169bf535549ad281ce"}
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.281054 4907 scope.go:117] "RemoveContainer" containerID="0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.293406 4907 generic.go:334] "Generic (PLEG): container finished" podID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerID="763157a01a09c74ad9db4bc011a29082c64eb93ef4cc4881c3aa84f67e627293" exitCode=0
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.293955 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk45h" event={"ID":"d5a8933b-b26c-4401-aa75-bb1e4796d546","Type":"ContainerDied","Data":"763157a01a09c74ad9db4bc011a29082c64eb93ef4cc4881c3aa84f67e627293"}
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.295638 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk45h" event={"ID":"d5a8933b-b26c-4401-aa75-bb1e4796d546","Type":"ContainerStarted","Data":"c57e49543f7d5675a01eac8051da7da6450b0af2c98700ee96d8736d8719acbf"}
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.327693 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.327720 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5f072fc-976b-4cc4-8c87-fe3c52ab9829-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.384790 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.389684 4907 scope.go:117] "RemoveContainer" containerID="33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.405718 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.413520 4907 scope.go:117] "RemoveContainer" containerID="060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.417935 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 29 14:54:50 crc kubenswrapper[4907]: E1129 14:54:50.418481 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="ceilometer-central-agent"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.418499 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="ceilometer-central-agent"
Nov 29 14:54:50 crc kubenswrapper[4907]: E1129 14:54:50.418525 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="proxy-httpd"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.418532 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="proxy-httpd"
Nov 29 14:54:50 crc kubenswrapper[4907]: E1129 14:54:50.418561 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="ceilometer-notification-agent"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.418569 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="ceilometer-notification-agent"
Nov 29 14:54:50 crc kubenswrapper[4907]: E1129 14:54:50.418586 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="sg-core"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.418594 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="sg-core"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.418800 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="proxy-httpd"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.418832 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="sg-core"
Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.418851
4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="ceilometer-central-agent" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.418870 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" containerName="ceilometer-notification-agent" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.422425 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.424964 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.425019 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.426649 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.430500 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.443772 4907 scope.go:117] "RemoveContainer" containerID="2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.466983 4907 scope.go:117] "RemoveContainer" containerID="0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f" Nov 29 14:54:50 crc kubenswrapper[4907]: E1129 14:54:50.468338 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f\": container with ID starting with 0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f not found: ID does not exist" 
containerID="0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.468403 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f"} err="failed to get container status \"0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f\": rpc error: code = NotFound desc = could not find container \"0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f\": container with ID starting with 0548e4761e4b81dde1b6120e195543687bbd5793765c88f0b45301f7ba4a240f not found: ID does not exist" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.468480 4907 scope.go:117] "RemoveContainer" containerID="33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5" Nov 29 14:54:50 crc kubenswrapper[4907]: E1129 14:54:50.470503 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5\": container with ID starting with 33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5 not found: ID does not exist" containerID="33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.470561 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5"} err="failed to get container status \"33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5\": rpc error: code = NotFound desc = could not find container \"33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5\": container with ID starting with 33268e28702d657a5731d84b1cff388994ed874be71a0486c8fbd570a8f5ddf5 not found: ID does not exist" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.470586 4907 scope.go:117] 
"RemoveContainer" containerID="060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a" Nov 29 14:54:50 crc kubenswrapper[4907]: E1129 14:54:50.471177 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a\": container with ID starting with 060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a not found: ID does not exist" containerID="060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.471219 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a"} err="failed to get container status \"060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a\": rpc error: code = NotFound desc = could not find container \"060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a\": container with ID starting with 060f1646219e88a30ef3d950bf8dce9220c9c32c39828668d24a082537d4d48a not found: ID does not exist" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.471234 4907 scope.go:117] "RemoveContainer" containerID="2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4" Nov 29 14:54:50 crc kubenswrapper[4907]: E1129 14:54:50.471799 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4\": container with ID starting with 2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4 not found: ID does not exist" containerID="2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.471845 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4"} err="failed to get container status \"2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4\": rpc error: code = NotFound desc = could not find container \"2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4\": container with ID starting with 2922cbdfbd770cb8ce916ff3a850b865569b2ca87ab34156c183a9bbc35df4f4 not found: ID does not exist" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.492950 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5f072fc-976b-4cc4-8c87-fe3c52ab9829" path="/var/lib/kubelet/pods/a5f072fc-976b-4cc4-8c87-fe3c52ab9829/volumes" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.532069 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n67rz\" (UniqueName: \"kubernetes.io/projected/311d9e86-5c91-4d3c-ab5c-41dac23726cf-kube-api-access-n67rz\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.532966 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-run-httpd\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.533461 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.534054 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-scripts\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.534480 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-config-data\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.534504 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.534715 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.534767 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-log-httpd\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.637676 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-scripts\") pod \"ceilometer-0\" (UID: 
\"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.638010 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-config-data\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.638035 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.638087 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.638111 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-log-httpd\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.638177 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n67rz\" (UniqueName: \"kubernetes.io/projected/311d9e86-5c91-4d3c-ab5c-41dac23726cf-kube-api-access-n67rz\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.638255 4907 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-run-httpd\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.638284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.638754 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-run-httpd\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.638917 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-log-httpd\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.642826 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.643122 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc 
kubenswrapper[4907]: I1129 14:54:50.644116 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-config-data\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.644416 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-scripts\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.645237 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.661407 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n67rz\" (UniqueName: \"kubernetes.io/projected/311d9e86-5c91-4d3c-ab5c-41dac23726cf-kube-api-access-n67rz\") pod \"ceilometer-0\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " pod="openstack/ceilometer-0" Nov 29 14:54:50 crc kubenswrapper[4907]: I1129 14:54:50.745042 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:54:51 crc kubenswrapper[4907]: I1129 14:54:51.281489 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:51 crc kubenswrapper[4907]: W1129 14:54:51.281885 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311d9e86_5c91_4d3c_ab5c_41dac23726cf.slice/crio-f90aeefbc7312ca8dd0dc0eb3890141088826bafccbb610302fd3412395950fd WatchSource:0}: Error finding container f90aeefbc7312ca8dd0dc0eb3890141088826bafccbb610302fd3412395950fd: Status 404 returned error can't find the container with id f90aeefbc7312ca8dd0dc0eb3890141088826bafccbb610302fd3412395950fd Nov 29 14:54:51 crc kubenswrapper[4907]: I1129 14:54:51.310005 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk45h" event={"ID":"d5a8933b-b26c-4401-aa75-bb1e4796d546","Type":"ContainerStarted","Data":"33d24229d855d20f9704ca594064850a01f6d4ce21165a4b041bcacc58a706d4"} Nov 29 14:54:51 crc kubenswrapper[4907]: I1129 14:54:51.316192 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerStarted","Data":"f90aeefbc7312ca8dd0dc0eb3890141088826bafccbb610302fd3412395950fd"} Nov 29 14:54:53 crc kubenswrapper[4907]: I1129 14:54:53.344647 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerStarted","Data":"570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f"} Nov 29 14:54:53 crc kubenswrapper[4907]: I1129 14:54:53.345238 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerStarted","Data":"247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff"} Nov 29 14:54:53 crc 
kubenswrapper[4907]: I1129 14:54:53.348781 4907 generic.go:334] "Generic (PLEG): container finished" podID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerID="33d24229d855d20f9704ca594064850a01f6d4ce21165a4b041bcacc58a706d4" exitCode=0 Nov 29 14:54:53 crc kubenswrapper[4907]: I1129 14:54:53.348820 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk45h" event={"ID":"d5a8933b-b26c-4401-aa75-bb1e4796d546","Type":"ContainerDied","Data":"33d24229d855d20f9704ca594064850a01f6d4ce21165a4b041bcacc58a706d4"} Nov 29 14:54:54 crc kubenswrapper[4907]: I1129 14:54:54.360490 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk45h" event={"ID":"d5a8933b-b26c-4401-aa75-bb1e4796d546","Type":"ContainerStarted","Data":"dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6"} Nov 29 14:54:54 crc kubenswrapper[4907]: I1129 14:54:54.364312 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerStarted","Data":"c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc"} Nov 29 14:54:54 crc kubenswrapper[4907]: I1129 14:54:54.385513 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lk45h" podStartSLOduration=2.870022106 podStartE2EDuration="6.385490454s" podCreationTimestamp="2025-11-29 14:54:48 +0000 UTC" firstStartedPulling="2025-11-29 14:54:50.298596875 +0000 UTC m=+1588.285434527" lastFinishedPulling="2025-11-29 14:54:53.814065233 +0000 UTC m=+1591.800902875" observedRunningTime="2025-11-29 14:54:54.379146035 +0000 UTC m=+1592.365983687" watchObservedRunningTime="2025-11-29 14:54:54.385490454 +0000 UTC m=+1592.372328106" Nov 29 14:54:55 crc kubenswrapper[4907]: I1129 14:54:55.784384 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-5vflp"] Nov 29 14:54:55 crc 
kubenswrapper[4907]: I1129 14:54:55.794908 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-5vflp"] Nov 29 14:54:55 crc kubenswrapper[4907]: I1129 14:54:55.903510 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-vz6m5"] Nov 29 14:54:55 crc kubenswrapper[4907]: I1129 14:54:55.906519 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:55 crc kubenswrapper[4907]: I1129 14:54:55.924595 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-vz6m5"] Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.066728 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-combined-ca-bundle\") pod \"heat-db-sync-vz6m5\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") " pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.067201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-config-data\") pod \"heat-db-sync-vz6m5\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") " pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.067299 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxtwg\" (UniqueName: \"kubernetes.io/projected/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-kube-api-access-lxtwg\") pod \"heat-db-sync-vz6m5\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") " pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.170221 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-combined-ca-bundle\") pod \"heat-db-sync-vz6m5\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") " pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.170268 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-config-data\") pod \"heat-db-sync-vz6m5\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") " pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.170301 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxtwg\" (UniqueName: \"kubernetes.io/projected/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-kube-api-access-lxtwg\") pod \"heat-db-sync-vz6m5\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") " pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.180103 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-config-data\") pod \"heat-db-sync-vz6m5\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") " pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.190763 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-combined-ca-bundle\") pod \"heat-db-sync-vz6m5\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") " pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.199792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxtwg\" (UniqueName: \"kubernetes.io/projected/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-kube-api-access-lxtwg\") pod \"heat-db-sync-vz6m5\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") " 
pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.255501 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-vz6m5" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.409843 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerStarted","Data":"0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32"} Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.410142 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.453809 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.464926594 podStartE2EDuration="6.453791769s" podCreationTimestamp="2025-11-29 14:54:50 +0000 UTC" firstStartedPulling="2025-11-29 14:54:51.283580067 +0000 UTC m=+1589.270417719" lastFinishedPulling="2025-11-29 14:54:55.272445252 +0000 UTC m=+1593.259282894" observedRunningTime="2025-11-29 14:54:56.44209549 +0000 UTC m=+1594.428933162" watchObservedRunningTime="2025-11-29 14:54:56.453791769 +0000 UTC m=+1594.440629421" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.505046 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e3e611f-5e4c-4b2c-baea-5f74745f315b" path="/var/lib/kubelet/pods/1e3e611f-5e4c-4b2c-baea-5f74745f315b/volumes" Nov 29 14:54:56 crc kubenswrapper[4907]: I1129 14:54:56.746889 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-vz6m5"] Nov 29 14:54:57 crc kubenswrapper[4907]: I1129 14:54:57.046171 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 14:54:57 crc kubenswrapper[4907]: I1129 14:54:57.432893 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-vz6m5" 
event={"ID":"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa","Type":"ContainerStarted","Data":"4519476e7d3ab717fa5fce2df3ff0a889241b7a7a6677d8d0b28477addc706ab"} Nov 29 14:54:57 crc kubenswrapper[4907]: I1129 14:54:57.747675 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 29 14:54:57 crc kubenswrapper[4907]: I1129 14:54:57.976339 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 14:54:58 crc kubenswrapper[4907]: I1129 14:54:58.490375 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 14:54:58 crc kubenswrapper[4907]: I1129 14:54:58.490421 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 14:54:58 crc kubenswrapper[4907]: I1129 14:54:58.498537 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 14:54:58 crc kubenswrapper[4907]: I1129 14:54:58.499616 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 14:54:58 crc kubenswrapper[4907]: I1129 14:54:58.499698 4907 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" gracePeriod=600 Nov 29 14:54:58 crc kubenswrapper[4907]: E1129 14:54:58.638775 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:54:58 crc kubenswrapper[4907]: I1129 14:54:58.835763 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lk45h" Nov 29 14:54:58 crc kubenswrapper[4907]: I1129 14:54:58.836005 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lk45h" Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.172572 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.172842 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="ceilometer-central-agent" containerID="cri-o://247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff" gracePeriod=30 Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.172987 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="proxy-httpd" containerID="cri-o://0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32" gracePeriod=30 Nov 29 
14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.173033 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="sg-core" containerID="cri-o://c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc" gracePeriod=30 Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.173063 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="ceilometer-notification-agent" containerID="cri-o://570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f" gracePeriod=30 Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.461561 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" exitCode=0 Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.461617 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f"} Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.461649 4907 scope.go:117] "RemoveContainer" containerID="d1627f8336c2b950c2441aa29da8e2bbe0bedafb1bb7676292ab2c4a335d23b1" Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.462413 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:54:59 crc kubenswrapper[4907]: E1129 14:54:59.462861 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.473844 4907 generic.go:334] "Generic (PLEG): container finished" podID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerID="0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32" exitCode=0 Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.473871 4907 generic.go:334] "Generic (PLEG): container finished" podID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerID="c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc" exitCode=2 Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.473927 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerDied","Data":"0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32"} Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.473968 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerDied","Data":"c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc"} Nov 29 14:54:59 crc kubenswrapper[4907]: I1129 14:54:59.894041 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lk45h" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="registry-server" probeResult="failure" output=< Nov 29 14:54:59 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 14:54:59 crc kubenswrapper[4907]: > Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.199731 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.310386 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-scripts\") pod \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.310472 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-ceilometer-tls-certs\") pod \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.310501 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-sg-core-conf-yaml\") pod \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.310576 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-run-httpd\") pod \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.310691 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-log-httpd\") pod \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.310734 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-config-data\") pod \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.310764 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n67rz\" (UniqueName: \"kubernetes.io/projected/311d9e86-5c91-4d3c-ab5c-41dac23726cf-kube-api-access-n67rz\") pod \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.310793 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-combined-ca-bundle\") pod \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\" (UID: \"311d9e86-5c91-4d3c-ab5c-41dac23726cf\") " Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.310926 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "311d9e86-5c91-4d3c-ab5c-41dac23726cf" (UID: "311d9e86-5c91-4d3c-ab5c-41dac23726cf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.311416 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.312561 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "311d9e86-5c91-4d3c-ab5c-41dac23726cf" (UID: "311d9e86-5c91-4d3c-ab5c-41dac23726cf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.316659 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311d9e86-5c91-4d3c-ab5c-41dac23726cf-kube-api-access-n67rz" (OuterVolumeSpecName: "kube-api-access-n67rz") pod "311d9e86-5c91-4d3c-ab5c-41dac23726cf" (UID: "311d9e86-5c91-4d3c-ab5c-41dac23726cf"). InnerVolumeSpecName "kube-api-access-n67rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.334501 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-scripts" (OuterVolumeSpecName: "scripts") pod "311d9e86-5c91-4d3c-ab5c-41dac23726cf" (UID: "311d9e86-5c91-4d3c-ab5c-41dac23726cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.395180 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "311d9e86-5c91-4d3c-ab5c-41dac23726cf" (UID: "311d9e86-5c91-4d3c-ab5c-41dac23726cf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.400662 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "311d9e86-5c91-4d3c-ab5c-41dac23726cf" (UID: "311d9e86-5c91-4d3c-ab5c-41dac23726cf"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.415310 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.415339 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.415351 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.415360 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/311d9e86-5c91-4d3c-ab5c-41dac23726cf-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.415369 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n67rz\" (UniqueName: \"kubernetes.io/projected/311d9e86-5c91-4d3c-ab5c-41dac23726cf-kube-api-access-n67rz\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.448762 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "311d9e86-5c91-4d3c-ab5c-41dac23726cf" (UID: "311d9e86-5c91-4d3c-ab5c-41dac23726cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.491427 4907 generic.go:334] "Generic (PLEG): container finished" podID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerID="570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f" exitCode=0 Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.491568 4907 generic.go:334] "Generic (PLEG): container finished" podID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerID="247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff" exitCode=0 Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.491630 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.497944 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-config-data" (OuterVolumeSpecName: "config-data") pod "311d9e86-5c91-4d3c-ab5c-41dac23726cf" (UID: "311d9e86-5c91-4d3c-ab5c-41dac23726cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.498552 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerDied","Data":"570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f"} Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.498596 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerDied","Data":"247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff"} Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.498611 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"311d9e86-5c91-4d3c-ab5c-41dac23726cf","Type":"ContainerDied","Data":"f90aeefbc7312ca8dd0dc0eb3890141088826bafccbb610302fd3412395950fd"} Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.498633 4907 scope.go:117] "RemoveContainer" containerID="0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.517340 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.517366 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311d9e86-5c91-4d3c-ab5c-41dac23726cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.528388 4907 scope.go:117] "RemoveContainer" containerID="c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.554201 4907 scope.go:117] "RemoveContainer" 
containerID="570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.578936 4907 scope.go:117] "RemoveContainer" containerID="247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.604925 4907 scope.go:117] "RemoveContainer" containerID="0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32" Nov 29 14:55:00 crc kubenswrapper[4907]: E1129 14:55:00.605385 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32\": container with ID starting with 0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32 not found: ID does not exist" containerID="0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.605426 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32"} err="failed to get container status \"0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32\": rpc error: code = NotFound desc = could not find container \"0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32\": container with ID starting with 0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32 not found: ID does not exist" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.605469 4907 scope.go:117] "RemoveContainer" containerID="c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc" Nov 29 14:55:00 crc kubenswrapper[4907]: E1129 14:55:00.605838 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc\": container with ID starting with 
c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc not found: ID does not exist" containerID="c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.605867 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc"} err="failed to get container status \"c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc\": rpc error: code = NotFound desc = could not find container \"c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc\": container with ID starting with c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc not found: ID does not exist" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.605884 4907 scope.go:117] "RemoveContainer" containerID="570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f" Nov 29 14:55:00 crc kubenswrapper[4907]: E1129 14:55:00.606151 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f\": container with ID starting with 570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f not found: ID does not exist" containerID="570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.606182 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f"} err="failed to get container status \"570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f\": rpc error: code = NotFound desc = could not find container \"570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f\": container with ID starting with 570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f not found: ID does not 
exist" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.606205 4907 scope.go:117] "RemoveContainer" containerID="247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff" Nov 29 14:55:00 crc kubenswrapper[4907]: E1129 14:55:00.606643 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff\": container with ID starting with 247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff not found: ID does not exist" containerID="247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.606663 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff"} err="failed to get container status \"247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff\": rpc error: code = NotFound desc = could not find container \"247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff\": container with ID starting with 247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff not found: ID does not exist" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.606676 4907 scope.go:117] "RemoveContainer" containerID="0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.606916 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32"} err="failed to get container status \"0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32\": rpc error: code = NotFound desc = could not find container \"0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32\": container with ID starting with 0d13c95dcd510ff13d332276df7877d9071332902474f6020b1a5daa6e05cd32 not found: ID 
does not exist" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.606946 4907 scope.go:117] "RemoveContainer" containerID="c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.607166 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc"} err="failed to get container status \"c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc\": rpc error: code = NotFound desc = could not find container \"c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc\": container with ID starting with c53266a7395aa3b759dd48ebe6e70ba9be37ffaef77d5cb709c5b9c82144e8bc not found: ID does not exist" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.607184 4907 scope.go:117] "RemoveContainer" containerID="570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.607507 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f"} err="failed to get container status \"570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f\": rpc error: code = NotFound desc = could not find container \"570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f\": container with ID starting with 570acd3824a2f5fdcf40578368ec7a6424dadb50520e5a2d4803653a4bd2b40f not found: ID does not exist" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.607526 4907 scope.go:117] "RemoveContainer" containerID="247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.607812 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff"} err="failed to get container 
status \"247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff\": rpc error: code = NotFound desc = could not find container \"247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff\": container with ID starting with 247a91be6ea9a8a77225141d73837b589529866a922d3e9d996ef18bf118d6ff not found: ID does not exist" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.833983 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.851185 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.866632 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:55:00 crc kubenswrapper[4907]: E1129 14:55:00.867126 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="sg-core" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.867144 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="sg-core" Nov 29 14:55:00 crc kubenswrapper[4907]: E1129 14:55:00.867152 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="ceilometer-central-agent" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.867159 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="ceilometer-central-agent" Nov 29 14:55:00 crc kubenswrapper[4907]: E1129 14:55:00.867227 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="proxy-httpd" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.867234 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="proxy-httpd" Nov 29 14:55:00 crc kubenswrapper[4907]: 
E1129 14:55:00.867245 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="ceilometer-notification-agent" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.867251 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="ceilometer-notification-agent" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.867478 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="ceilometer-central-agent" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.867502 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="proxy-httpd" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.867510 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="ceilometer-notification-agent" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.867525 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" containerName="sg-core" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.869615 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.872640 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.872777 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.872873 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.881395 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.925799 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-run-httpd\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.926105 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.926184 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-config-data\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.926370 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-scripts\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.926618 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-log-httpd\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.926672 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.926885 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfcst\" (UniqueName: \"kubernetes.io/projected/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-kube-api-access-hfcst\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:00 crc kubenswrapper[4907]: I1129 14:55:00.926969 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.028749 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-log-httpd\") pod \"ceilometer-0\" (UID: 
\"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.028798 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.028850 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfcst\" (UniqueName: \"kubernetes.io/projected/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-kube-api-access-hfcst\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.028874 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.028909 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-run-httpd\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.028985 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.029014 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-config-data\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.029039 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-scripts\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.029228 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-log-httpd\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.029405 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-run-httpd\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.033657 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.033772 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 
crc kubenswrapper[4907]: I1129 14:55:01.033854 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-config-data\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.033867 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-scripts\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.041823 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.045554 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfcst\" (UniqueName: \"kubernetes.io/projected/6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a-kube-api-access-hfcst\") pod \"ceilometer-0\" (UID: \"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a\") " pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.234412 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.713956 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 14:55:01 crc kubenswrapper[4907]: I1129 14:55:01.969895 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerName="rabbitmq" containerID="cri-o://028361a10b98546053368bb93dba0cba2998d73e54ae4ff700656f76950bbf7b" gracePeriod=604796 Nov 29 14:55:02 crc kubenswrapper[4907]: I1129 14:55:02.504802 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8aee0179-2960-486d-8129-1e928d55a29f" containerName="rabbitmq" containerID="cri-o://5e8faf209da0c431e5051a080910785ed675c3ec5f414d6d2f380f167c75b11a" gracePeriod=604796 Nov 29 14:55:02 crc kubenswrapper[4907]: I1129 14:55:02.508643 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311d9e86-5c91-4d3c-ab5c-41dac23726cf" path="/var/lib/kubelet/pods/311d9e86-5c91-4d3c-ab5c-41dac23726cf/volumes" Nov 29 14:55:02 crc kubenswrapper[4907]: I1129 14:55:02.541525 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a","Type":"ContainerStarted","Data":"a9b723364bfa1848cd6d75295f0e08817b11a76360f546a518544c4acc044842"} Nov 29 14:55:02 crc kubenswrapper[4907]: I1129 14:55:02.834706 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused" Nov 29 14:55:03 crc kubenswrapper[4907]: I1129 14:55:03.302923 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8aee0179-2960-486d-8129-1e928d55a29f" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.130:5671: connect: connection refused" Nov 29 14:55:08 crc kubenswrapper[4907]: I1129 14:55:08.612948 4907 generic.go:334] "Generic (PLEG): container finished" podID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerID="028361a10b98546053368bb93dba0cba2998d73e54ae4ff700656f76950bbf7b" exitCode=0 Nov 29 14:55:08 crc kubenswrapper[4907]: I1129 14:55:08.613054 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"550beb06-c1d9-4568-bb4e-66ff9134cb8e","Type":"ContainerDied","Data":"028361a10b98546053368bb93dba0cba2998d73e54ae4ff700656f76950bbf7b"} Nov 29 14:55:08 crc kubenswrapper[4907]: I1129 14:55:08.903159 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lk45h" Nov 29 14:55:08 crc kubenswrapper[4907]: I1129 14:55:08.967039 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lk45h" Nov 29 14:55:09 crc kubenswrapper[4907]: I1129 14:55:09.151631 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lk45h"] Nov 29 14:55:09 crc kubenswrapper[4907]: E1129 14:55:09.436005 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aee0179_2960_486d_8129_1e928d55a29f.slice/crio-conmon-5e8faf209da0c431e5051a080910785ed675c3ec5f414d6d2f380f167c75b11a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aee0179_2960_486d_8129_1e928d55a29f.slice/crio-5e8faf209da0c431e5051a080910785ed675c3ec5f414d6d2f380f167c75b11a.scope\": RecentStats: unable to find data in memory cache]" Nov 29 14:55:09 crc kubenswrapper[4907]: I1129 14:55:09.633154 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="8aee0179-2960-486d-8129-1e928d55a29f" containerID="5e8faf209da0c431e5051a080910785ed675c3ec5f414d6d2f380f167c75b11a" exitCode=0 Nov 29 14:55:09 crc kubenswrapper[4907]: I1129 14:55:09.633288 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8aee0179-2960-486d-8129-1e928d55a29f","Type":"ContainerDied","Data":"5e8faf209da0c431e5051a080910785ed675c3ec5f414d6d2f380f167c75b11a"} Nov 29 14:55:10 crc kubenswrapper[4907]: I1129 14:55:10.645307 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lk45h" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="registry-server" containerID="cri-o://dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6" gracePeriod=2 Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.361491 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-js795"] Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.392547 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.399187 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.407946 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-js795"] Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.496381 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgvpr\" (UniqueName: \"kubernetes.io/projected/6fcc0554-5a79-4c86-b831-16dfca6c2746-kube-api-access-qgvpr\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.496479 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.496521 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.496567 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: 
\"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.496606 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-config\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.496956 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.496976 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.599111 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgvpr\" (UniqueName: \"kubernetes.io/projected/6fcc0554-5a79-4c86-b831-16dfca6c2746-kube-api-access-qgvpr\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.599217 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.599288 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.600143 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.600231 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.600286 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.600363 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-config\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: 
\"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.600422 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.600451 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.600520 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.601293 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.601508 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-config\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc 
kubenswrapper[4907]: I1129 14:55:11.601864 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.646556 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgvpr\" (UniqueName: \"kubernetes.io/projected/6fcc0554-5a79-4c86-b831-16dfca6c2746-kube-api-access-qgvpr\") pod \"dnsmasq-dns-7d84b4d45c-js795\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.698743 4907 generic.go:334] "Generic (PLEG): container finished" podID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerID="dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6" exitCode=0 Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.698812 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk45h" event={"ID":"d5a8933b-b26c-4401-aa75-bb1e4796d546","Type":"ContainerDied","Data":"dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6"} Nov 29 14:55:11 crc kubenswrapper[4907]: I1129 14:55:11.725222 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:12 crc kubenswrapper[4907]: I1129 14:55:12.489989 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:55:12 crc kubenswrapper[4907]: E1129 14:55:12.490264 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:55:17 crc kubenswrapper[4907]: I1129 14:55:17.834616 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: i/o timeout" Nov 29 14:55:18 crc kubenswrapper[4907]: I1129 14:55:18.303640 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8aee0179-2960-486d-8129-1e928d55a29f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: i/o timeout" Nov 29 14:55:18 crc kubenswrapper[4907]: E1129 14:55:18.840542 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6 is running failed: container process not found" containerID="dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6" cmd=["grpc_health_probe","-addr=:50051"] Nov 29 14:55:18 crc kubenswrapper[4907]: E1129 14:55:18.844644 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if 
PID of dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6 is running failed: container process not found" containerID="dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6" cmd=["grpc_health_probe","-addr=:50051"] Nov 29 14:55:18 crc kubenswrapper[4907]: E1129 14:55:18.850579 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6 is running failed: container process not found" containerID="dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6" cmd=["grpc_health_probe","-addr=:50051"] Nov 29 14:55:18 crc kubenswrapper[4907]: E1129 14:55:18.850649 4907 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-lk45h" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="registry-server" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.465731 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.472411 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.556766 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-tls\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.556896 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-plugins\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.556931 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-plugins-conf\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.557054 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-server-conf\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.557100 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-config-data\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.557170 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5whq\" (UniqueName: 
\"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-kube-api-access-p5whq\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.557250 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-erlang-cookie\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.557289 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-confd\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.557363 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aee0179-2960-486d-8129-1e928d55a29f-erlang-cookie-secret\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.557421 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aee0179-2960-486d-8129-1e928d55a29f-pod-info\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.557488 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8aee0179-2960-486d-8129-1e928d55a29f\" (UID: \"8aee0179-2960-486d-8129-1e928d55a29f\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.558908 
4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.560815 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.564901 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.565208 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8aee0179-2960-486d-8129-1e928d55a29f-pod-info" (OuterVolumeSpecName: "pod-info") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.579227 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.594013 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.602004 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-kube-api-access-p5whq" (OuterVolumeSpecName: "kube-api-access-p5whq") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "kube-api-access-p5whq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.602220 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aee0179-2960-486d-8129-1e928d55a29f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.655938 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-server-conf" (OuterVolumeSpecName: "server-conf") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.662798 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-config-data\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.662848 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-confd\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.662951 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-erlang-cookie\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.662992 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqdqh\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-kube-api-access-gqdqh\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663036 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663073 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-plugins\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663167 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-tls\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663224 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-server-conf\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663259 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-plugins-conf\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663318 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/550beb06-c1d9-4568-bb4e-66ff9134cb8e-erlang-cookie-secret\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: 
\"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663352 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/550beb06-c1d9-4568-bb4e-66ff9134cb8e-pod-info\") pod \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\" (UID: \"550beb06-c1d9-4568-bb4e-66ff9134cb8e\") " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663845 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-config-data" (OuterVolumeSpecName: "config-data") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663938 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663951 4907 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8aee0179-2960-486d-8129-1e928d55a29f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663961 4907 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8aee0179-2960-486d-8129-1e928d55a29f-pod-info\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663982 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663991 4907 reconciler_common.go:293] "Volume 
detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.663999 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.664007 4907 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.664015 4907 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-server-conf\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.664022 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aee0179-2960-486d-8129-1e928d55a29f-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.664031 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5whq\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-kube-api-access-p5whq\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.677337 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.677746 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.679208 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.684250 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/550beb06-c1d9-4568-bb4e-66ff9134cb8e-pod-info" (OuterVolumeSpecName: "pod-info") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.694523 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/550beb06-c1d9-4568-bb4e-66ff9134cb8e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.704789 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.704827 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.714641 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-kube-api-access-gqdqh" (OuterVolumeSpecName: "kube-api-access-gqdqh") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "kube-api-access-gqdqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.727910 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.756902 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-config-data" (OuterVolumeSpecName: "config-data") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.766578 4907 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/550beb06-c1d9-4568-bb4e-66ff9134cb8e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.766610 4907 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/550beb06-c1d9-4568-bb4e-66ff9134cb8e-pod-info\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.766620 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.766632 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.766642 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqdqh\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-kube-api-access-gqdqh\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.766671 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.766680 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 
14:55:19.766689 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.766700 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.766708 4907 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.786194 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-server-conf" (OuterVolumeSpecName: "server-conf") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.817930 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.827915 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"550beb06-c1d9-4568-bb4e-66ff9134cb8e","Type":"ContainerDied","Data":"d1435dba1ce494ba69ef6bc36ef7a9c35cde74d6a14652960b3de681cc797049"} Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.827972 4907 scope.go:117] "RemoveContainer" containerID="028361a10b98546053368bb93dba0cba2998d73e54ae4ff700656f76950bbf7b" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.827963 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.830523 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "550beb06-c1d9-4568-bb4e-66ff9134cb8e" (UID: "550beb06-c1d9-4568-bb4e-66ff9134cb8e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.835835 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8aee0179-2960-486d-8129-1e928d55a29f","Type":"ContainerDied","Data":"404850aa49bc77f573aac1632419270aa1b073880dce5acd45edd4eb16ba0358"} Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.835954 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.846700 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8aee0179-2960-486d-8129-1e928d55a29f" (UID: "8aee0179-2960-486d-8129-1e928d55a29f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.869057 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/550beb06-c1d9-4568-bb4e-66ff9134cb8e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.869082 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8aee0179-2960-486d-8129-1e928d55a29f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.869092 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:19 crc kubenswrapper[4907]: I1129 14:55:19.869102 4907 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/550beb06-c1d9-4568-bb4e-66ff9134cb8e-server-conf\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.037366 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lk45h" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.057747 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.057787 4907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.059358 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxtwg,
ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-vz6m5_openstack(4cc96345-5c70-4c46-8ec2-8c53e2fe35aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.067368 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-vz6m5" podUID="4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.167655 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.175873 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw946\" (UniqueName: \"kubernetes.io/projected/d5a8933b-b26c-4401-aa75-bb1e4796d546-kube-api-access-kw946\") pod \"d5a8933b-b26c-4401-aa75-bb1e4796d546\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.175922 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-utilities\") pod \"d5a8933b-b26c-4401-aa75-bb1e4796d546\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.176184 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-catalog-content\") pod \"d5a8933b-b26c-4401-aa75-bb1e4796d546\" (UID: \"d5a8933b-b26c-4401-aa75-bb1e4796d546\") " Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.177013 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-utilities" (OuterVolumeSpecName: "utilities") pod "d5a8933b-b26c-4401-aa75-bb1e4796d546" (UID: "d5a8933b-b26c-4401-aa75-bb1e4796d546"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.194579 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a8933b-b26c-4401-aa75-bb1e4796d546-kube-api-access-kw946" (OuterVolumeSpecName: "kube-api-access-kw946") pod "d5a8933b-b26c-4401-aa75-bb1e4796d546" (UID: "d5a8933b-b26c-4401-aa75-bb1e4796d546"). InnerVolumeSpecName "kube-api-access-kw946". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.203976 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.226226 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.235568 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5a8933b-b26c-4401-aa75-bb1e4796d546" (UID: "d5a8933b-b26c-4401-aa75-bb1e4796d546"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.240097 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.240676 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="extract-content" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.240693 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="extract-content" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.240711 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerName="rabbitmq" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.240719 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerName="rabbitmq" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.240737 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aee0179-2960-486d-8129-1e928d55a29f" containerName="setup-container" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 
14:55:20.240747 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aee0179-2960-486d-8129-1e928d55a29f" containerName="setup-container" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.240766 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="extract-utilities" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.240771 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="extract-utilities" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.240787 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="registry-server" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.240793 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="registry-server" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.240818 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerName="setup-container" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.240824 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerName="setup-container" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.240846 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aee0179-2960-486d-8129-1e928d55a29f" containerName="rabbitmq" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.240852 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aee0179-2960-486d-8129-1e928d55a29f" containerName="rabbitmq" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.241082 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" containerName="registry-server" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.241095 4907 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8aee0179-2960-486d-8129-1e928d55a29f" containerName="rabbitmq" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.241107 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" containerName="rabbitmq" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.242352 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.246283 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.246400 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.246473 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.246504 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.246529 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.246586 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lpglp" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.246814 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.252498 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.275945 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 
14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.279147 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.279182 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw946\" (UniqueName: \"kubernetes.io/projected/d5a8933b-b26c-4401-aa75-bb1e4796d546-kube-api-access-kw946\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.279193 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a8933b-b26c-4401-aa75-bb1e4796d546-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.288148 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.290163 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.293542 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qjtjp" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.293698 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.293817 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.293987 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.294156 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.295038 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.295179 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.302394 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.383775 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/07559e4d-3526-441a-a08d-e11c60e80761-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.383837 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.383879 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63f606f9-1313-4d39-8f54-78078cbd256e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.383894 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/07559e4d-3526-441a-a08d-e11c60e80761-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.383917 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.383934 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/07559e4d-3526-441a-a08d-e11c60e80761-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.383952 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.383968 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384010 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384040 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpvft\" (UniqueName: \"kubernetes.io/projected/07559e4d-3526-441a-a08d-e11c60e80761-kube-api-access-xpvft\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384065 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtbw7\" (UniqueName: \"kubernetes.io/projected/63f606f9-1313-4d39-8f54-78078cbd256e-kube-api-access-xtbw7\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384093 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/07559e4d-3526-441a-a08d-e11c60e80761-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384132 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63f606f9-1313-4d39-8f54-78078cbd256e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384153 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384183 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63f606f9-1313-4d39-8f54-78078cbd256e-config-data\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384200 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63f606f9-1313-4d39-8f54-78078cbd256e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384219 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384247 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384263 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384308 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63f606f9-1313-4d39-8f54-78078cbd256e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384322 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/07559e4d-3526-441a-a08d-e11c60e80761-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.384350 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486103 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63f606f9-1313-4d39-8f54-78078cbd256e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486423 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/07559e4d-3526-441a-a08d-e11c60e80761-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486460 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486486 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/07559e4d-3526-441a-a08d-e11c60e80761-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486511 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " 
pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486537 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/07559e4d-3526-441a-a08d-e11c60e80761-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486554 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63f606f9-1313-4d39-8f54-78078cbd256e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486578 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486595 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/07559e4d-3526-441a-a08d-e11c60e80761-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486613 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486634 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486667 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486714 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpvft\" (UniqueName: \"kubernetes.io/projected/07559e4d-3526-441a-a08d-e11c60e80761-kube-api-access-xpvft\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486746 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtbw7\" (UniqueName: \"kubernetes.io/projected/63f606f9-1313-4d39-8f54-78078cbd256e-kube-api-access-xtbw7\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486784 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07559e4d-3526-441a-a08d-e11c60e80761-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486834 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/63f606f9-1313-4d39-8f54-78078cbd256e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486857 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486885 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63f606f9-1313-4d39-8f54-78078cbd256e-config-data\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486903 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63f606f9-1313-4d39-8f54-78078cbd256e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486920 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486953 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486971 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.486962 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.487396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/07559e4d-3526-441a-a08d-e11c60e80761-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.487751 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07559e4d-3526-441a-a08d-e11c60e80761-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.487863 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.488287 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/07559e4d-3526-441a-a08d-e11c60e80761-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.488387 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.489500 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.489796 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63f606f9-1313-4d39-8f54-78078cbd256e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.490017 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.490610 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/63f606f9-1313-4d39-8f54-78078cbd256e-config-data\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.490061 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63f606f9-1313-4d39-8f54-78078cbd256e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.491409 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.494273 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/07559e4d-3526-441a-a08d-e11c60e80761-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.494393 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.495482 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/07559e4d-3526-441a-a08d-e11c60e80761-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.495797 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.496065 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/07559e4d-3526-441a-a08d-e11c60e80761-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.497487 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550beb06-c1d9-4568-bb4e-66ff9134cb8e" path="/var/lib/kubelet/pods/550beb06-c1d9-4568-bb4e-66ff9134cb8e/volumes" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.498798 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aee0179-2960-486d-8129-1e928d55a29f" path="/var/lib/kubelet/pods/8aee0179-2960-486d-8129-1e928d55a29f/volumes" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.504024 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63f606f9-1313-4d39-8f54-78078cbd256e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.504321 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63f606f9-1313-4d39-8f54-78078cbd256e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " 
pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.504334 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtbw7\" (UniqueName: \"kubernetes.io/projected/63f606f9-1313-4d39-8f54-78078cbd256e-kube-api-access-xtbw7\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.510433 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpvft\" (UniqueName: \"kubernetes.io/projected/07559e4d-3526-441a-a08d-e11c60e80761-kube-api-access-xpvft\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.514962 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63f606f9-1313-4d39-8f54-78078cbd256e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.549166 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"63f606f9-1313-4d39-8f54-78078cbd256e\") " pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.558267 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"07559e4d-3526-441a-a08d-e11c60e80761\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.579083 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.614295 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.762116 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.762177 4907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.762326 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n59ch675h65bh8ch8dh585h5f8h676h689h554h674h566h65bh5b6hc7h645h594h567h697h647h56ch684hf8hcbh95h564h546hcfh6ch78h8fh84q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hfcst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.784248 4907 scope.go:117] "RemoveContainer" containerID="57e9e8cb7a73ad419533e89d66570ed4fe9e23124e85292737bf8eb73653d7b3" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.851668 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lk45h" event={"ID":"d5a8933b-b26c-4401-aa75-bb1e4796d546","Type":"ContainerDied","Data":"c57e49543f7d5675a01eac8051da7da6450b0af2c98700ee96d8736d8719acbf"} Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.851755 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lk45h" Nov 29 14:55:20 crc kubenswrapper[4907]: E1129 14:55:20.859025 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-vz6m5" podUID="4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.915410 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lk45h"] Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.918517 4907 scope.go:117] "RemoveContainer" containerID="5e8faf209da0c431e5051a080910785ed675c3ec5f414d6d2f380f167c75b11a" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.927394 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lk45h"] Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.945208 4907 scope.go:117] "RemoveContainer" containerID="b6497214a1f83cc739bcf81c7e151db9861c2777d57157ead4c53dcba818c0a6" Nov 29 14:55:20 crc kubenswrapper[4907]: I1129 14:55:20.987358 4907 scope.go:117] "RemoveContainer" containerID="dfd3432c813a00dc84779453ccc111ec776f4ff126970732ca1fbb283cd432b6" Nov 29 14:55:21 crc kubenswrapper[4907]: I1129 14:55:21.050539 4907 scope.go:117] "RemoveContainer" containerID="33d24229d855d20f9704ca594064850a01f6d4ce21165a4b041bcacc58a706d4" Nov 29 14:55:21 crc kubenswrapper[4907]: I1129 14:55:21.099983 4907 scope.go:117] "RemoveContainer" containerID="763157a01a09c74ad9db4bc011a29082c64eb93ef4cc4881c3aa84f67e627293" Nov 29 14:55:21 crc kubenswrapper[4907]: I1129 14:55:21.392476 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-js795"] Nov 29 14:55:21 crc kubenswrapper[4907]: I1129 14:55:21.506640 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-server-0"] Nov 29 14:55:21 crc kubenswrapper[4907]: I1129 14:55:21.516521 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 14:55:21 crc kubenswrapper[4907]: W1129 14:55:21.519425 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63f606f9_1313_4d39_8f54_78078cbd256e.slice/crio-3daa58865e78f00c069009cad0f9c63b30dfdfb9467fe1772ce111a9230727f7 WatchSource:0}: Error finding container 3daa58865e78f00c069009cad0f9c63b30dfdfb9467fe1772ce111a9230727f7: Status 404 returned error can't find the container with id 3daa58865e78f00c069009cad0f9c63b30dfdfb9467fe1772ce111a9230727f7 Nov 29 14:55:21 crc kubenswrapper[4907]: I1129 14:55:21.884365 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a","Type":"ContainerStarted","Data":"8107d0c372e94949c717b42dc751220cf8a41af76fcd73dd5de9a5d98453aeb9"} Nov 29 14:55:21 crc kubenswrapper[4907]: I1129 14:55:21.887333 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"07559e4d-3526-441a-a08d-e11c60e80761","Type":"ContainerStarted","Data":"9dc93861a5fbfae95e90c61da8944a5ad681eeaa107d6ed5d5538649add2d9bf"} Nov 29 14:55:21 crc kubenswrapper[4907]: I1129 14:55:21.888528 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" event={"ID":"6fcc0554-5a79-4c86-b831-16dfca6c2746","Type":"ContainerStarted","Data":"829be6df668c3809f12eeb5692a86bb676ddb6d1c240242e20fe3fa6e43f96b5"} Nov 29 14:55:21 crc kubenswrapper[4907]: I1129 14:55:21.894595 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63f606f9-1313-4d39-8f54-78078cbd256e","Type":"ContainerStarted","Data":"3daa58865e78f00c069009cad0f9c63b30dfdfb9467fe1772ce111a9230727f7"} Nov 29 14:55:22 crc 
kubenswrapper[4907]: I1129 14:55:22.502426 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a8933b-b26c-4401-aa75-bb1e4796d546" path="/var/lib/kubelet/pods/d5a8933b-b26c-4401-aa75-bb1e4796d546/volumes" Nov 29 14:55:22 crc kubenswrapper[4907]: I1129 14:55:22.906511 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a","Type":"ContainerStarted","Data":"e69219f558171dcff43b6cb89d9af15a3c97fcdd14f465d25e8672f8ba125036"} Nov 29 14:55:22 crc kubenswrapper[4907]: I1129 14:55:22.907623 4907 generic.go:334] "Generic (PLEG): container finished" podID="6fcc0554-5a79-4c86-b831-16dfca6c2746" containerID="0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547" exitCode=0 Nov 29 14:55:22 crc kubenswrapper[4907]: I1129 14:55:22.907644 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" event={"ID":"6fcc0554-5a79-4c86-b831-16dfca6c2746","Type":"ContainerDied","Data":"0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547"} Nov 29 14:55:23 crc kubenswrapper[4907]: I1129 14:55:23.921616 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63f606f9-1313-4d39-8f54-78078cbd256e","Type":"ContainerStarted","Data":"c3dcb5ec5ed4edcff01d567d97d496be235dac203446aaf697d44979877f223a"} Nov 29 14:55:23 crc kubenswrapper[4907]: I1129 14:55:23.928645 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"07559e4d-3526-441a-a08d-e11c60e80761","Type":"ContainerStarted","Data":"14c7d5a7f9daf26e95df26a2ead765c77633ce18da538f878087ed257830bba7"} Nov 29 14:55:23 crc kubenswrapper[4907]: I1129 14:55:23.930938 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" 
event={"ID":"6fcc0554-5a79-4c86-b831-16dfca6c2746","Type":"ContainerStarted","Data":"c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196"} Nov 29 14:55:23 crc kubenswrapper[4907]: I1129 14:55:23.931224 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:23 crc kubenswrapper[4907]: I1129 14:55:23.964963 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" podStartSLOduration=12.964940933 podStartE2EDuration="12.964940933s" podCreationTimestamp="2025-11-29 14:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:55:23.964913993 +0000 UTC m=+1621.951751685" watchObservedRunningTime="2025-11-29 14:55:23.964940933 +0000 UTC m=+1621.951778575" Nov 29 14:55:24 crc kubenswrapper[4907]: I1129 14:55:24.480681 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:55:24 crc kubenswrapper[4907]: E1129 14:55:24.481299 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:55:24 crc kubenswrapper[4907]: E1129 14:55:24.597960 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a" Nov 29 14:55:24 crc kubenswrapper[4907]: I1129 14:55:24.955053 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a","Type":"ContainerStarted","Data":"bd24d943b5403fe7c0634d3935a39b1a88a2952a3d7c06017a10efb088a303f4"}
Nov 29 14:55:24 crc kubenswrapper[4907]: I1129 14:55:24.957220 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 29 14:55:24 crc kubenswrapper[4907]: E1129 14:55:24.959552 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a"
Nov 29 14:55:25 crc kubenswrapper[4907]: E1129 14:55:25.966187 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a"
Nov 29 14:55:31 crc kubenswrapper[4907]: I1129 14:55:31.727385 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-js795"
Nov 29 14:55:31 crc kubenswrapper[4907]: I1129 14:55:31.854671 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8"]
Nov 29 14:55:31 crc kubenswrapper[4907]: I1129 14:55:31.855209 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" podUID="c0b2f060-1dbc-40ef-b32f-932d470fb4f6" containerName="dnsmasq-dns" containerID="cri-o://dfc52db0ff989888f36a5a0810924b1055be1105eb4b41652ff730a6a4e347bc" gracePeriod=10
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.023331 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-7bd8m"]
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.025417 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.067346 4907 generic.go:334] "Generic (PLEG): container finished" podID="c0b2f060-1dbc-40ef-b32f-932d470fb4f6" containerID="dfc52db0ff989888f36a5a0810924b1055be1105eb4b41652ff730a6a4e347bc" exitCode=0
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.067396 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" event={"ID":"c0b2f060-1dbc-40ef-b32f-932d470fb4f6","Type":"ContainerDied","Data":"dfc52db0ff989888f36a5a0810924b1055be1105eb4b41652ff730a6a4e347bc"}
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.070207 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-7bd8m"]
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.121157 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.121225 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw5bv\" (UniqueName: \"kubernetes.io/projected/a05a8a5d-2682-419c-abb4-3b4bb8920a68-kube-api-access-fw5bv\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.121255 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.121634 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.121953 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.122012 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.122042 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-config\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.223819 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.224199 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.224290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.224316 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.224335 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-config\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.224365 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.224396 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw5bv\" (UniqueName: \"kubernetes.io/projected/a05a8a5d-2682-419c-abb4-3b4bb8920a68-kube-api-access-fw5bv\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.224839 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.224843 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.225396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-config\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.225428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.225428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.227053 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a05a8a5d-2682-419c-abb4-3b4bb8920a68-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.248306 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw5bv\" (UniqueName: \"kubernetes.io/projected/a05a8a5d-2682-419c-abb4-3b4bb8920a68-kube-api-access-fw5bv\") pod \"dnsmasq-dns-6f6df4f56c-7bd8m\" (UID: \"a05a8a5d-2682-419c-abb4-3b4bb8920a68\") " pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.343468 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.500427 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8"
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.535803 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-config\") pod \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") "
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.535905 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-nb\") pod \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") "
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.535991 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-sb\") pod \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") "
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.536011 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-svc\") pod \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") "
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.536061 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkqwc\" (UniqueName: \"kubernetes.io/projected/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-kube-api-access-tkqwc\") pod \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") "
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.536164 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-swift-storage-0\") pod \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\" (UID: \"c0b2f060-1dbc-40ef-b32f-932d470fb4f6\") "
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.567903 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-kube-api-access-tkqwc" (OuterVolumeSpecName: "kube-api-access-tkqwc") pod "c0b2f060-1dbc-40ef-b32f-932d470fb4f6" (UID: "c0b2f060-1dbc-40ef-b32f-932d470fb4f6"). InnerVolumeSpecName "kube-api-access-tkqwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.627119 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c0b2f060-1dbc-40ef-b32f-932d470fb4f6" (UID: "c0b2f060-1dbc-40ef-b32f-932d470fb4f6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.635555 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-config" (OuterVolumeSpecName: "config") pod "c0b2f060-1dbc-40ef-b32f-932d470fb4f6" (UID: "c0b2f060-1dbc-40ef-b32f-932d470fb4f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.640765 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkqwc\" (UniqueName: \"kubernetes.io/projected/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-kube-api-access-tkqwc\") on node \"crc\" DevicePath \"\""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.640793 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.640803 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-config\") on node \"crc\" DevicePath \"\""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.665776 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0b2f060-1dbc-40ef-b32f-932d470fb4f6" (UID: "c0b2f060-1dbc-40ef-b32f-932d470fb4f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.685985 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0b2f060-1dbc-40ef-b32f-932d470fb4f6" (UID: "c0b2f060-1dbc-40ef-b32f-932d470fb4f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.697121 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0b2f060-1dbc-40ef-b32f-932d470fb4f6" (UID: "c0b2f060-1dbc-40ef-b32f-932d470fb4f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.742299 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.742324 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.742333 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b2f060-1dbc-40ef-b32f-932d470fb4f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 29 14:55:32 crc kubenswrapper[4907]: I1129 14:55:32.896850 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-7bd8m"]
Nov 29 14:55:33 crc kubenswrapper[4907]: I1129 14:55:33.080746 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m" event={"ID":"a05a8a5d-2682-419c-abb4-3b4bb8920a68","Type":"ContainerStarted","Data":"b4d01d6b5156586f21829a2bb27fe646f1ead6a0454db9f59742c00f35434b90"}
Nov 29 14:55:33 crc kubenswrapper[4907]: I1129 14:55:33.083577 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8" event={"ID":"c0b2f060-1dbc-40ef-b32f-932d470fb4f6","Type":"ContainerDied","Data":"ab576c0ed6897a68912b43abf5d358a7cbdf9cac1fad86068e10e0011f29de32"}
Nov 29 14:55:33 crc kubenswrapper[4907]: I1129 14:55:33.083776 4907 scope.go:117] "RemoveContainer" containerID="dfc52db0ff989888f36a5a0810924b1055be1105eb4b41652ff730a6a4e347bc"
Nov 29 14:55:33 crc kubenswrapper[4907]: I1129 14:55:33.083659 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8"
Nov 29 14:55:33 crc kubenswrapper[4907]: I1129 14:55:33.127025 4907 scope.go:117] "RemoveContainer" containerID="3f285d25b679d349b99759a61c35f12cb993237886348e63d5daec8f051eada4"
Nov 29 14:55:33 crc kubenswrapper[4907]: I1129 14:55:33.156736 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8"]
Nov 29 14:55:33 crc kubenswrapper[4907]: I1129 14:55:33.167578 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-wtxh8"]
Nov 29 14:55:34 crc kubenswrapper[4907]: I1129 14:55:34.099786 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-vz6m5" event={"ID":"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa","Type":"ContainerStarted","Data":"26cfe9ef18f81683b9a4dd6218379a9f055a8cd54775f69ad0055a6716b0d29b"}
Nov 29 14:55:34 crc kubenswrapper[4907]: I1129 14:55:34.103026 4907 generic.go:334] "Generic (PLEG): container finished" podID="a05a8a5d-2682-419c-abb4-3b4bb8920a68" containerID="1f83132cb591da4cf14c4f2c718d73e97761c68b9f90ad3549a0ce2038593636" exitCode=0
Nov 29 14:55:34 crc kubenswrapper[4907]: I1129 14:55:34.103086 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m" event={"ID":"a05a8a5d-2682-419c-abb4-3b4bb8920a68","Type":"ContainerDied","Data":"1f83132cb591da4cf14c4f2c718d73e97761c68b9f90ad3549a0ce2038593636"}
Nov 29 14:55:34 crc kubenswrapper[4907]: I1129 14:55:34.140856 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-vz6m5" podStartSLOduration=2.189390793 podStartE2EDuration="39.140835963s" podCreationTimestamp="2025-11-29 14:54:55 +0000 UTC" firstStartedPulling="2025-11-29 14:54:56.730881092 +0000 UTC m=+1594.717718744" lastFinishedPulling="2025-11-29 14:55:33.682326262 +0000 UTC m=+1631.669163914" observedRunningTime="2025-11-29 14:55:34.126955983 +0000 UTC m=+1632.113793655" watchObservedRunningTime="2025-11-29 14:55:34.140835963 +0000 UTC m=+1632.127673615"
Nov 29 14:55:34 crc kubenswrapper[4907]: I1129 14:55:34.493734 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b2f060-1dbc-40ef-b32f-932d470fb4f6" path="/var/lib/kubelet/pods/c0b2f060-1dbc-40ef-b32f-932d470fb4f6/volumes"
Nov 29 14:55:35 crc kubenswrapper[4907]: I1129 14:55:35.115397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m" event={"ID":"a05a8a5d-2682-419c-abb4-3b4bb8920a68","Type":"ContainerStarted","Data":"39ff78bc63a8f5f71239c7bc86ccdbf190e1f96da81c9f19dad72b07c96a9349"}
Nov 29 14:55:35 crc kubenswrapper[4907]: I1129 14:55:35.115930 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m"
Nov 29 14:55:35 crc kubenswrapper[4907]: I1129 14:55:35.151679 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m" podStartSLOduration=4.151658534 podStartE2EDuration="4.151658534s" podCreationTimestamp="2025-11-29 14:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:55:35.138216495 +0000 UTC m=+1633.125054187" watchObservedRunningTime="2025-11-29 14:55:35.151658534 +0000 UTC m=+1633.138496196"
Nov 29 14:55:36 crc kubenswrapper[4907]: I1129 14:55:36.134196 4907 generic.go:334] "Generic (PLEG): container finished" podID="4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" containerID="26cfe9ef18f81683b9a4dd6218379a9f055a8cd54775f69ad0055a6716b0d29b" exitCode=0
Nov 29 14:55:36 crc kubenswrapper[4907]: I1129 14:55:36.134256 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-vz6m5" event={"ID":"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa","Type":"ContainerDied","Data":"26cfe9ef18f81683b9a4dd6218379a9f055a8cd54775f69ad0055a6716b0d29b"}
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.479589 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f"
Nov 29 14:55:37 crc kubenswrapper[4907]: E1129 14:55:37.479938 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.497048 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.670456 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-vz6m5"
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.775223 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-config-data\") pod \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") "
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.776336 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-combined-ca-bundle\") pod \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") "
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.776466 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxtwg\" (UniqueName: \"kubernetes.io/projected/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-kube-api-access-lxtwg\") pod \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\" (UID: \"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa\") "
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.781758 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-kube-api-access-lxtwg" (OuterVolumeSpecName: "kube-api-access-lxtwg") pod "4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" (UID: "4cc96345-5c70-4c46-8ec2-8c53e2fe35aa"). InnerVolumeSpecName "kube-api-access-lxtwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.817894 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" (UID: "4cc96345-5c70-4c46-8ec2-8c53e2fe35aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.880107 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.880155 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxtwg\" (UniqueName: \"kubernetes.io/projected/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-kube-api-access-lxtwg\") on node \"crc\" DevicePath \"\""
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.896615 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-config-data" (OuterVolumeSpecName: "config-data") pod "4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" (UID: "4cc96345-5c70-4c46-8ec2-8c53e2fe35aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 14:55:37 crc kubenswrapper[4907]: I1129 14:55:37.982217 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 14:55:38 crc kubenswrapper[4907]: I1129 14:55:38.166107 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-vz6m5" event={"ID":"4cc96345-5c70-4c46-8ec2-8c53e2fe35aa","Type":"ContainerDied","Data":"4519476e7d3ab717fa5fce2df3ff0a889241b7a7a6677d8d0b28477addc706ab"}
Nov 29 14:55:38 crc kubenswrapper[4907]: I1129 14:55:38.166164 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4519476e7d3ab717fa5fce2df3ff0a889241b7a7a6677d8d0b28477addc706ab"
Nov 29 14:55:38 crc kubenswrapper[4907]: I1129 14:55:38.166249 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-vz6m5"
Nov 29 14:55:38 crc kubenswrapper[4907]: I1129 14:55:38.171934 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a","Type":"ContainerStarted","Data":"0d8fd20ab03310774b0b1f417420d9798738bd28e5b3b4178d3f060b1981b887"}
Nov 29 14:55:38 crc kubenswrapper[4907]: I1129 14:55:38.240990 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.276314235 podStartE2EDuration="38.240963547s" podCreationTimestamp="2025-11-29 14:55:00 +0000 UTC" firstStartedPulling="2025-11-29 14:55:01.749856072 +0000 UTC m=+1599.736693724" lastFinishedPulling="2025-11-29 14:55:37.714505384 +0000 UTC m=+1635.701343036" observedRunningTime="2025-11-29 14:55:38.21688659 +0000 UTC m=+1636.203724262" watchObservedRunningTime="2025-11-29 14:55:38.240963547 +0000 UTC m=+1636.227801209"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.091313 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-57dd6cc64-twm82"]
Nov 29 14:55:39 crc kubenswrapper[4907]: E1129 14:55:39.092167 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" containerName="heat-db-sync"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.092184 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" containerName="heat-db-sync"
Nov 29 14:55:39 crc kubenswrapper[4907]: E1129 14:55:39.092244 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b2f060-1dbc-40ef-b32f-932d470fb4f6" containerName="dnsmasq-dns"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.092252 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b2f060-1dbc-40ef-b32f-932d470fb4f6" containerName="dnsmasq-dns"
Nov 29 14:55:39 crc kubenswrapper[4907]: E1129 14:55:39.092269 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b2f060-1dbc-40ef-b32f-932d470fb4f6" containerName="init"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.092277 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b2f060-1dbc-40ef-b32f-932d470fb4f6" containerName="init"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.092596 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b2f060-1dbc-40ef-b32f-932d470fb4f6" containerName="dnsmasq-dns"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.092617 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" containerName="heat-db-sync"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.093617 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-57dd6cc64-twm82"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.110464 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-57dd6cc64-twm82"]
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.172001 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5bcf4f8684-6877z"]
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.173508 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.193977 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5bcf4f8684-6877z"]
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.199491 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7b4cb5586d-96kjn"]
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.201141 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7b4cb5586d-96kjn"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.217429 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7b4cb5586d-96kjn"]
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.219952 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-config-data\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.219993 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-public-tls-certs\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.220062 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-combined-ca-bundle\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.220099 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-combined-ca-bundle\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.220125 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-config-data-custom\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.220157 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbx7\" (UniqueName: \"kubernetes.io/projected/f688c14e-91bc-4525-acd8-a0c9d440dff4-kube-api-access-gsbx7\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.220174 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-config-data\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.220209 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgxk\" (UniqueName: \"kubernetes.io/projected/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-kube-api-access-6hgxk\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.220244 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-internal-tls-certs\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.220287 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-config-data-custom\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.322959 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-config-data-custom\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.323165 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-public-tls-certs\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.324008 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-config-data\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.324049 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-public-tls-certs\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.324108 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rzkc\" (UniqueName: \"kubernetes.io/projected/ac083b7c-e604-4aa0-98ed-66668134ad44-kube-api-access-8rzkc\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.324232 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-combined-ca-bundle\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.324832 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-combined-ca-bundle\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.324944 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-combined-ca-bundle\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.324999 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-config-data-custom\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82"
Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.325025 4907
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-internal-tls-certs\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.325064 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbx7\" (UniqueName: \"kubernetes.io/projected/f688c14e-91bc-4525-acd8-a0c9d440dff4-kube-api-access-gsbx7\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.325089 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-config-data\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.325150 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-config-data\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.325177 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgxk\" (UniqueName: \"kubernetes.io/projected/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-kube-api-access-6hgxk\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.325224 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-config-data-custom\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.325264 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-internal-tls-certs\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.338428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-internal-tls-certs\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.338469 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-combined-ca-bundle\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.338650 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-combined-ca-bundle\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.339326 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-config-data\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.339384 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-config-data-custom\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.339497 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-config-data\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.341500 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-public-tls-certs\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.348924 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f688c14e-91bc-4525-acd8-a0c9d440dff4-config-data-custom\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.350652 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsbx7\" (UniqueName: 
\"kubernetes.io/projected/f688c14e-91bc-4525-acd8-a0c9d440dff4-kube-api-access-gsbx7\") pod \"heat-api-5bcf4f8684-6877z\" (UID: \"f688c14e-91bc-4525-acd8-a0c9d440dff4\") " pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.350930 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgxk\" (UniqueName: \"kubernetes.io/projected/5d44a8a1-d845-407d-8769-8c0ccbebc4d2-kube-api-access-6hgxk\") pod \"heat-engine-57dd6cc64-twm82\" (UID: \"5d44a8a1-d845-407d-8769-8c0ccbebc4d2\") " pod="openstack/heat-engine-57dd6cc64-twm82" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.422567 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-57dd6cc64-twm82" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.434683 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-internal-tls-certs\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.434988 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-config-data\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.435124 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-config-data-custom\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: 
I1129 14:55:39.435353 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-public-tls-certs\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.435496 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rzkc\" (UniqueName: \"kubernetes.io/projected/ac083b7c-e604-4aa0-98ed-66668134ad44-kube-api-access-8rzkc\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.435629 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-combined-ca-bundle\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.438634 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-internal-tls-certs\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.439968 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-config-data-custom\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.439967 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-combined-ca-bundle\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.440854 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-config-data\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.444980 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac083b7c-e604-4aa0-98ed-66668134ad44-public-tls-certs\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.472841 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rzkc\" (UniqueName: \"kubernetes.io/projected/ac083b7c-e604-4aa0-98ed-66668134ad44-kube-api-access-8rzkc\") pod \"heat-cfnapi-7b4cb5586d-96kjn\" (UID: \"ac083b7c-e604-4aa0-98ed-66668134ad44\") " pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.505994 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:39 crc kubenswrapper[4907]: I1129 14:55:39.524219 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:40 crc kubenswrapper[4907]: I1129 14:55:40.051180 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-57dd6cc64-twm82"] Nov 29 14:55:40 crc kubenswrapper[4907]: I1129 14:55:40.140038 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7b4cb5586d-96kjn"] Nov 29 14:55:40 crc kubenswrapper[4907]: W1129 14:55:40.141430 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac083b7c_e604_4aa0_98ed_66668134ad44.slice/crio-d72db08d6c50d9cde018442e9a992bfe049801a8f43d89f33b2b0bc15a41ad1c WatchSource:0}: Error finding container d72db08d6c50d9cde018442e9a992bfe049801a8f43d89f33b2b0bc15a41ad1c: Status 404 returned error can't find the container with id d72db08d6c50d9cde018442e9a992bfe049801a8f43d89f33b2b0bc15a41ad1c Nov 29 14:55:40 crc kubenswrapper[4907]: I1129 14:55:40.152114 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5bcf4f8684-6877z"] Nov 29 14:55:40 crc kubenswrapper[4907]: I1129 14:55:40.225911 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" event={"ID":"ac083b7c-e604-4aa0-98ed-66668134ad44","Type":"ContainerStarted","Data":"d72db08d6c50d9cde018442e9a992bfe049801a8f43d89f33b2b0bc15a41ad1c"} Nov 29 14:55:40 crc kubenswrapper[4907]: I1129 14:55:40.227263 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bcf4f8684-6877z" event={"ID":"f688c14e-91bc-4525-acd8-a0c9d440dff4","Type":"ContainerStarted","Data":"99c1770995b612bb7ac6b3c6a8b7face70b3679268e3cde532df5496226dcb18"} Nov 29 14:55:40 crc kubenswrapper[4907]: I1129 14:55:40.228698 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-57dd6cc64-twm82" 
event={"ID":"5d44a8a1-d845-407d-8769-8c0ccbebc4d2","Type":"ContainerStarted","Data":"9eca8c1ed17f300a06e1b63a6d3dfdb5dd6384ec6ccf2e2cf52ec50bc07c901e"} Nov 29 14:55:41 crc kubenswrapper[4907]: I1129 14:55:41.246383 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-57dd6cc64-twm82" event={"ID":"5d44a8a1-d845-407d-8769-8c0ccbebc4d2","Type":"ContainerStarted","Data":"ad29a98310aa3a2207d051a076601ebf8e414a95023fee6a0f7c0662042d823a"} Nov 29 14:55:41 crc kubenswrapper[4907]: I1129 14:55:41.247597 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-57dd6cc64-twm82" Nov 29 14:55:41 crc kubenswrapper[4907]: I1129 14:55:41.290389 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-57dd6cc64-twm82" podStartSLOduration=2.290366287 podStartE2EDuration="2.290366287s" podCreationTimestamp="2025-11-29 14:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:55:41.278989287 +0000 UTC m=+1639.265826939" watchObservedRunningTime="2025-11-29 14:55:41.290366287 +0000 UTC m=+1639.277203949" Nov 29 14:55:42 crc kubenswrapper[4907]: I1129 14:55:42.346623 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-7bd8m" Nov 29 14:55:42 crc kubenswrapper[4907]: I1129 14:55:42.401791 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-js795"] Nov 29 14:55:42 crc kubenswrapper[4907]: I1129 14:55:42.402049 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" podUID="6fcc0554-5a79-4c86-b831-16dfca6c2746" containerName="dnsmasq-dns" containerID="cri-o://c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196" gracePeriod=10 Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.094142 4907 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.131225 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-sb\") pod \"6fcc0554-5a79-4c86-b831-16dfca6c2746\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.131265 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-swift-storage-0\") pod \"6fcc0554-5a79-4c86-b831-16dfca6c2746\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.131337 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-nb\") pod \"6fcc0554-5a79-4c86-b831-16dfca6c2746\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.131393 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-openstack-edpm-ipam\") pod \"6fcc0554-5a79-4c86-b831-16dfca6c2746\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.131415 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-svc\") pod \"6fcc0554-5a79-4c86-b831-16dfca6c2746\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.131529 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-config\") pod \"6fcc0554-5a79-4c86-b831-16dfca6c2746\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.131680 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgvpr\" (UniqueName: \"kubernetes.io/projected/6fcc0554-5a79-4c86-b831-16dfca6c2746-kube-api-access-qgvpr\") pod \"6fcc0554-5a79-4c86-b831-16dfca6c2746\" (UID: \"6fcc0554-5a79-4c86-b831-16dfca6c2746\") " Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.165088 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fcc0554-5a79-4c86-b831-16dfca6c2746-kube-api-access-qgvpr" (OuterVolumeSpecName: "kube-api-access-qgvpr") pod "6fcc0554-5a79-4c86-b831-16dfca6c2746" (UID: "6fcc0554-5a79-4c86-b831-16dfca6c2746"). InnerVolumeSpecName "kube-api-access-qgvpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.235222 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgvpr\" (UniqueName: \"kubernetes.io/projected/6fcc0554-5a79-4c86-b831-16dfca6c2746-kube-api-access-qgvpr\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.280165 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-config" (OuterVolumeSpecName: "config") pod "6fcc0554-5a79-4c86-b831-16dfca6c2746" (UID: "6fcc0554-5a79-4c86-b831-16dfca6c2746"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.282128 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6fcc0554-5a79-4c86-b831-16dfca6c2746" (UID: "6fcc0554-5a79-4c86-b831-16dfca6c2746"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.305540 4907 generic.go:334] "Generic (PLEG): container finished" podID="6fcc0554-5a79-4c86-b831-16dfca6c2746" containerID="c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196" exitCode=0 Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.305631 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" event={"ID":"6fcc0554-5a79-4c86-b831-16dfca6c2746","Type":"ContainerDied","Data":"c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196"} Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.305667 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" event={"ID":"6fcc0554-5a79-4c86-b831-16dfca6c2746","Type":"ContainerDied","Data":"829be6df668c3809f12eeb5692a86bb676ddb6d1c240242e20fe3fa6e43f96b5"} Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.305669 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-js795" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.306802 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bcf4f8684-6877z" event={"ID":"f688c14e-91bc-4525-acd8-a0c9d440dff4","Type":"ContainerStarted","Data":"be90d55bd3b973dffa8c76fa5d2232c876a82d6e2cf61fbce8c010ffba9f1309"} Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.307013 4907 scope.go:117] "RemoveContainer" containerID="c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.308212 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.311832 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" event={"ID":"ac083b7c-e604-4aa0-98ed-66668134ad44","Type":"ContainerStarted","Data":"4244d58cec46ad31bda682752f2334e5e6a279392eed1d725f8a275439d9933f"} Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.312464 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.336685 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5bcf4f8684-6877z" podStartSLOduration=2.402342567 podStartE2EDuration="4.336664832s" podCreationTimestamp="2025-11-29 14:55:39 +0000 UTC" firstStartedPulling="2025-11-29 14:55:40.168536852 +0000 UTC m=+1638.155374504" lastFinishedPulling="2025-11-29 14:55:42.102859117 +0000 UTC m=+1640.089696769" observedRunningTime="2025-11-29 14:55:43.322238146 +0000 UTC m=+1641.309075798" watchObservedRunningTime="2025-11-29 14:55:43.336664832 +0000 UTC m=+1641.323502484" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.338546 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.338663 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-config\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.377342 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6fcc0554-5a79-4c86-b831-16dfca6c2746" (UID: "6fcc0554-5a79-4c86-b831-16dfca6c2746"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.389991 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6fcc0554-5a79-4c86-b831-16dfca6c2746" (UID: "6fcc0554-5a79-4c86-b831-16dfca6c2746"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.413681 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6fcc0554-5a79-4c86-b831-16dfca6c2746" (UID: "6fcc0554-5a79-4c86-b831-16dfca6c2746"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.452751 4907 scope.go:117] "RemoveContainer" containerID="0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.460549 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6fcc0554-5a79-4c86-b831-16dfca6c2746" (UID: "6fcc0554-5a79-4c86-b831-16dfca6c2746"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.474785 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.490289 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" podStartSLOduration=2.534148516 podStartE2EDuration="4.490265104s" podCreationTimestamp="2025-11-29 14:55:39 +0000 UTC" firstStartedPulling="2025-11-29 14:55:40.146749169 +0000 UTC m=+1638.133586821" lastFinishedPulling="2025-11-29 14:55:42.102865757 +0000 UTC m=+1640.089703409" observedRunningTime="2025-11-29 14:55:43.473044049 +0000 UTC m=+1641.459881701" watchObservedRunningTime="2025-11-29 14:55:43.490265104 +0000 UTC m=+1641.477102756" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.520819 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.522192 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.595629 4907 scope.go:117] "RemoveContainer" containerID="c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196" Nov 29 14:55:43 crc kubenswrapper[4907]: E1129 14:55:43.606292 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196\": container with ID starting with c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196 not found: ID does not exist" containerID="c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.606335 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196"} err="failed to get container status \"c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196\": rpc error: code = NotFound desc = could not find container \"c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196\": container with ID starting with c38575014ce2e95fe8364d0cc96e5a1f33522164ea727b90f4e7ccb9ca13f196 not found: ID does not exist" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.606360 4907 scope.go:117] "RemoveContainer" containerID="0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547" Nov 29 14:55:43 crc kubenswrapper[4907]: E1129 14:55:43.610551 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547\": container with ID starting with 0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547 not found: ID does not exist" containerID="0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547" Nov 
29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.610584 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547"} err="failed to get container status \"0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547\": rpc error: code = NotFound desc = could not find container \"0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547\": container with ID starting with 0cc4c7c4f8dcd1c97b3711440805c4a634f33275e1fbae94e456ab49050d6547 not found: ID does not exist" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.627422 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fcc0554-5a79-4c86-b831-16dfca6c2746-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.680901 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-js795"] Nov 29 14:55:43 crc kubenswrapper[4907]: I1129 14:55:43.695899 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-js795"] Nov 29 14:55:44 crc kubenswrapper[4907]: I1129 14:55:44.493379 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fcc0554-5a79-4c86-b831-16dfca6c2746" path="/var/lib/kubelet/pods/6fcc0554-5a79-4c86-b831-16dfca6c2746/volumes" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.005324 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj"] Nov 29 14:55:47 crc kubenswrapper[4907]: E1129 14:55:47.006393 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fcc0554-5a79-4c86-b831-16dfca6c2746" containerName="init" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.006413 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fcc0554-5a79-4c86-b831-16dfca6c2746" containerName="init" Nov 
29 14:55:47 crc kubenswrapper[4907]: E1129 14:55:47.006517 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fcc0554-5a79-4c86-b831-16dfca6c2746" containerName="dnsmasq-dns" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.006526 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fcc0554-5a79-4c86-b831-16dfca6c2746" containerName="dnsmasq-dns" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.006769 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fcc0554-5a79-4c86-b831-16dfca6c2746" containerName="dnsmasq-dns" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.007640 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.009367 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.009627 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.009998 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.010123 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.024750 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj"] Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.104867 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.105199 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm5x2\" (UniqueName: \"kubernetes.io/projected/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-kube-api-access-mm5x2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.105287 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.105325 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.207561 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.207667 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.207846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.208007 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm5x2\" (UniqueName: \"kubernetes.io/projected/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-kube-api-access-mm5x2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.213146 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.217378 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.219431 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.223396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm5x2\" (UniqueName: \"kubernetes.io/projected/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-kube-api-access-mm5x2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:47 crc kubenswrapper[4907]: I1129 14:55:47.331270 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:55:48 crc kubenswrapper[4907]: I1129 14:55:48.040881 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 14:55:48 crc kubenswrapper[4907]: I1129 14:55:48.040990 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj"] Nov 29 14:55:48 crc kubenswrapper[4907]: I1129 14:55:48.376820 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" event={"ID":"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3","Type":"ContainerStarted","Data":"8281ef03dc0ed23bcf232f2602d62f349461652ca0dd071d998f1b00eaea156b"} Nov 29 14:55:51 crc kubenswrapper[4907]: I1129 14:55:51.252034 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7b4cb5586d-96kjn" Nov 29 14:55:51 crc kubenswrapper[4907]: I1129 14:55:51.253595 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5bcf4f8684-6877z" Nov 29 14:55:51 crc kubenswrapper[4907]: I1129 14:55:51.350364 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-9f67f96b7-8krnv"] Nov 29 14:55:51 crc kubenswrapper[4907]: I1129 14:55:51.350625 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" podUID="62227625-11c5-4d0a-b990-a1995069e259" containerName="heat-cfnapi" containerID="cri-o://289d1d6822408120efa26bdf6670d03e6d1d5d6da01a556794102056704114d8" gracePeriod=60 Nov 29 14:55:51 crc kubenswrapper[4907]: I1129 14:55:51.386047 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-54546bdf79-77p2l"] Nov 29 14:55:51 crc kubenswrapper[4907]: I1129 14:55:51.386546 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/heat-api-54546bdf79-77p2l" podUID="2ea22f3e-15c5-4f6e-9269-1da424d29342" containerName="heat-api" containerID="cri-o://35e7b30e09c58a06501159656c980ee970d673c53b74f12ff1639fc6aa15e539" gracePeriod=60 Nov 29 14:55:52 crc kubenswrapper[4907]: I1129 14:55:52.504964 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:55:52 crc kubenswrapper[4907]: E1129 14:55:52.505513 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:55:54 crc kubenswrapper[4907]: I1129 14:55:54.553941 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" podUID="62227625-11c5-4d0a-b990-a1995069e259" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.219:8000/healthcheck\": read tcp 10.217.0.2:34146->10.217.0.219:8000: read: connection reset by peer" Nov 29 14:55:54 crc kubenswrapper[4907]: I1129 14:55:54.570190 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-54546bdf79-77p2l" podUID="2ea22f3e-15c5-4f6e-9269-1da424d29342" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.220:8004/healthcheck\": read tcp 10.217.0.2:45084->10.217.0.220:8004: read: connection reset by peer" Nov 29 14:55:55 crc kubenswrapper[4907]: I1129 14:55:55.504830 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54546bdf79-77p2l" event={"ID":"2ea22f3e-15c5-4f6e-9269-1da424d29342","Type":"ContainerDied","Data":"35e7b30e09c58a06501159656c980ee970d673c53b74f12ff1639fc6aa15e539"} Nov 29 14:55:55 crc 
kubenswrapper[4907]: I1129 14:55:55.504808 4907 generic.go:334] "Generic (PLEG): container finished" podID="2ea22f3e-15c5-4f6e-9269-1da424d29342" containerID="35e7b30e09c58a06501159656c980ee970d673c53b74f12ff1639fc6aa15e539" exitCode=0 Nov 29 14:55:55 crc kubenswrapper[4907]: I1129 14:55:55.508555 4907 generic.go:334] "Generic (PLEG): container finished" podID="62227625-11c5-4d0a-b990-a1995069e259" containerID="289d1d6822408120efa26bdf6670d03e6d1d5d6da01a556794102056704114d8" exitCode=0 Nov 29 14:55:55 crc kubenswrapper[4907]: I1129 14:55:55.508598 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" event={"ID":"62227625-11c5-4d0a-b990-a1995069e259","Type":"ContainerDied","Data":"289d1d6822408120efa26bdf6670d03e6d1d5d6da01a556794102056704114d8"} Nov 29 14:55:56 crc kubenswrapper[4907]: I1129 14:55:56.523638 4907 generic.go:334] "Generic (PLEG): container finished" podID="63f606f9-1313-4d39-8f54-78078cbd256e" containerID="c3dcb5ec5ed4edcff01d567d97d496be235dac203446aaf697d44979877f223a" exitCode=0 Nov 29 14:55:56 crc kubenswrapper[4907]: I1129 14:55:56.523705 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63f606f9-1313-4d39-8f54-78078cbd256e","Type":"ContainerDied","Data":"c3dcb5ec5ed4edcff01d567d97d496be235dac203446aaf697d44979877f223a"} Nov 29 14:55:56 crc kubenswrapper[4907]: I1129 14:55:56.525500 4907 generic.go:334] "Generic (PLEG): container finished" podID="07559e4d-3526-441a-a08d-e11c60e80761" containerID="14c7d5a7f9daf26e95df26a2ead765c77633ce18da538f878087ed257830bba7" exitCode=0 Nov 29 14:55:56 crc kubenswrapper[4907]: I1129 14:55:56.525541 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"07559e4d-3526-441a-a08d-e11c60e80761","Type":"ContainerDied","Data":"14c7d5a7f9daf26e95df26a2ead765c77633ce18da538f878087ed257830bba7"} Nov 29 14:55:56 crc kubenswrapper[4907]: I1129 14:55:56.854937 4907 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" podUID="62227625-11c5-4d0a-b990-a1995069e259" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.219:8000/healthcheck\": dial tcp 10.217.0.219:8000: connect: connection refused" Nov 29 14:55:56 crc kubenswrapper[4907]: I1129 14:55:56.866266 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-54546bdf79-77p2l" podUID="2ea22f3e-15c5-4f6e-9269-1da424d29342" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.220:8004/healthcheck\": dial tcp 10.217.0.220:8004: connect: connection refused" Nov 29 14:55:59 crc kubenswrapper[4907]: I1129 14:55:59.495166 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-57dd6cc64-twm82" Nov 29 14:55:59 crc kubenswrapper[4907]: I1129 14:55:59.605051 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-cdc66dd5-7pf8b"] Nov 29 14:55:59 crc kubenswrapper[4907]: I1129 14:55:59.605362 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-cdc66dd5-7pf8b" podUID="08229553-e114-45c9-a109-b01223241912" containerName="heat-engine" containerID="cri-o://297d0cd5ee3cbab6d0c630d62813378481056594efe1d066186f89706e7980ac" gracePeriod=60 Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.008457 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.097517 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data-custom\") pod \"2ea22f3e-15c5-4f6e-9269-1da424d29342\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.097635 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-internal-tls-certs\") pod \"2ea22f3e-15c5-4f6e-9269-1da424d29342\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.097661 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htr2k\" (UniqueName: \"kubernetes.io/projected/2ea22f3e-15c5-4f6e-9269-1da424d29342-kube-api-access-htr2k\") pod \"2ea22f3e-15c5-4f6e-9269-1da424d29342\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.097770 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data\") pod \"2ea22f3e-15c5-4f6e-9269-1da424d29342\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.097983 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-public-tls-certs\") pod \"2ea22f3e-15c5-4f6e-9269-1da424d29342\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.098045 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-combined-ca-bundle\") pod \"2ea22f3e-15c5-4f6e-9269-1da424d29342\" (UID: \"2ea22f3e-15c5-4f6e-9269-1da424d29342\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.111547 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea22f3e-15c5-4f6e-9269-1da424d29342-kube-api-access-htr2k" (OuterVolumeSpecName: "kube-api-access-htr2k") pod "2ea22f3e-15c5-4f6e-9269-1da424d29342" (UID: "2ea22f3e-15c5-4f6e-9269-1da424d29342"). InnerVolumeSpecName "kube-api-access-htr2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.122566 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2ea22f3e-15c5-4f6e-9269-1da424d29342" (UID: "2ea22f3e-15c5-4f6e-9269-1da424d29342"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.167983 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.198420 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ea22f3e-15c5-4f6e-9269-1da424d29342" (UID: "2ea22f3e-15c5-4f6e-9269-1da424d29342"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.201432 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.201551 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.201667 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htr2k\" (UniqueName: \"kubernetes.io/projected/2ea22f3e-15c5-4f6e-9269-1da424d29342-kube-api-access-htr2k\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.240461 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2ea22f3e-15c5-4f6e-9269-1da424d29342" (UID: "2ea22f3e-15c5-4f6e-9269-1da424d29342"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.305662 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data-custom\") pod \"62227625-11c5-4d0a-b990-a1995069e259\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.305768 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbp6w\" (UniqueName: \"kubernetes.io/projected/62227625-11c5-4d0a-b990-a1995069e259-kube-api-access-vbp6w\") pod \"62227625-11c5-4d0a-b990-a1995069e259\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.305804 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data\") pod \"62227625-11c5-4d0a-b990-a1995069e259\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.305835 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-public-tls-certs\") pod \"62227625-11c5-4d0a-b990-a1995069e259\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.305876 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-combined-ca-bundle\") pod \"62227625-11c5-4d0a-b990-a1995069e259\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.305939 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-internal-tls-certs\") pod \"62227625-11c5-4d0a-b990-a1995069e259\" (UID: \"62227625-11c5-4d0a-b990-a1995069e259\") " Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.306481 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.312828 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62227625-11c5-4d0a-b990-a1995069e259" (UID: "62227625-11c5-4d0a-b990-a1995069e259"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.313492 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2ea22f3e-15c5-4f6e-9269-1da424d29342" (UID: "2ea22f3e-15c5-4f6e-9269-1da424d29342"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.317424 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data" (OuterVolumeSpecName: "config-data") pod "2ea22f3e-15c5-4f6e-9269-1da424d29342" (UID: "2ea22f3e-15c5-4f6e-9269-1da424d29342"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.328595 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62227625-11c5-4d0a-b990-a1995069e259-kube-api-access-vbp6w" (OuterVolumeSpecName: "kube-api-access-vbp6w") pod "62227625-11c5-4d0a-b990-a1995069e259" (UID: "62227625-11c5-4d0a-b990-a1995069e259"). InnerVolumeSpecName "kube-api-access-vbp6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.353636 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62227625-11c5-4d0a-b990-a1995069e259" (UID: "62227625-11c5-4d0a-b990-a1995069e259"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.408159 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.408190 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.408201 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbp6w\" (UniqueName: \"kubernetes.io/projected/62227625-11c5-4d0a-b990-a1995069e259-kube-api-access-vbp6w\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.408211 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.408220 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ea22f3e-15c5-4f6e-9269-1da424d29342-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.409611 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data" (OuterVolumeSpecName: "config-data") pod "62227625-11c5-4d0a-b990-a1995069e259" (UID: "62227625-11c5-4d0a-b990-a1995069e259"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.420451 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "62227625-11c5-4d0a-b990-a1995069e259" (UID: "62227625-11c5-4d0a-b990-a1995069e259"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.454670 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "62227625-11c5-4d0a-b990-a1995069e259" (UID: "62227625-11c5-4d0a-b990-a1995069e259"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.510006 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.510033 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.510043 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62227625-11c5-4d0a-b990-a1995069e259-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.588690 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" event={"ID":"62227625-11c5-4d0a-b990-a1995069e259","Type":"ContainerDied","Data":"7abf1bc478f3716cc213a7aee10c776620cb5f105fbfb112eaa3f99bb9ca8d17"} Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.588728 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-9f67f96b7-8krnv" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.588743 4907 scope.go:117] "RemoveContainer" containerID="289d1d6822408120efa26bdf6670d03e6d1d5d6da01a556794102056704114d8" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.591911 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"07559e4d-3526-441a-a08d-e11c60e80761","Type":"ContainerStarted","Data":"6fa9527e8179f3b0ba540e895dcb4d591f7dd7d8ba60bb61cb8a672de6deed94"} Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.592639 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.603331 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63f606f9-1313-4d39-8f54-78078cbd256e","Type":"ContainerStarted","Data":"6c6bc7e41db345a2d9ec04624e21220f811e03062035a4f8cc56ebe8cb743128"} Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.604393 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.609945 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" event={"ID":"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3","Type":"ContainerStarted","Data":"5218bdc69a0f10e5f563f9857aae9548accffec13577a3b0290192d1f219d2dc"} Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.614417 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54546bdf79-77p2l" event={"ID":"2ea22f3e-15c5-4f6e-9269-1da424d29342","Type":"ContainerDied","Data":"6997f44751d0223ce751251fb0e639903ab1fd2ffcfae0430743d397842a4e59"} Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.614467 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-54546bdf79-77p2l" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.633624 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.633584008 podStartE2EDuration="40.633584008s" podCreationTimestamp="2025-11-29 14:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:56:00.617732142 +0000 UTC m=+1658.604569794" watchObservedRunningTime="2025-11-29 14:56:00.633584008 +0000 UTC m=+1658.620421660" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.654025 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.654002043 podStartE2EDuration="40.654002043s" podCreationTimestamp="2025-11-29 14:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 14:56:00.640925305 +0000 UTC m=+1658.627762967" watchObservedRunningTime="2025-11-29 14:56:00.654002043 +0000 UTC m=+1658.640839715" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.660496 4907 scope.go:117] "RemoveContainer" containerID="35e7b30e09c58a06501159656c980ee970d673c53b74f12ff1639fc6aa15e539" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.685069 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-9f67f96b7-8krnv"] Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.696420 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-9f67f96b7-8krnv"] Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.732190 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" podStartSLOduration=3.129512462 podStartE2EDuration="14.732165902s" 
podCreationTimestamp="2025-11-29 14:55:46 +0000 UTC" firstStartedPulling="2025-11-29 14:55:48.040185643 +0000 UTC m=+1646.027023335" lastFinishedPulling="2025-11-29 14:55:59.642839123 +0000 UTC m=+1657.629676775" observedRunningTime="2025-11-29 14:56:00.685721115 +0000 UTC m=+1658.672558767" watchObservedRunningTime="2025-11-29 14:56:00.732165902 +0000 UTC m=+1658.719003554" Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.760208 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-54546bdf79-77p2l"] Nov 29 14:56:00 crc kubenswrapper[4907]: I1129 14:56:00.781796 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-54546bdf79-77p2l"] Nov 29 14:56:02 crc kubenswrapper[4907]: I1129 14:56:02.499680 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea22f3e-15c5-4f6e-9269-1da424d29342" path="/var/lib/kubelet/pods/2ea22f3e-15c5-4f6e-9269-1da424d29342/volumes" Nov 29 14:56:02 crc kubenswrapper[4907]: I1129 14:56:02.500830 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62227625-11c5-4d0a-b990-a1995069e259" path="/var/lib/kubelet/pods/62227625-11c5-4d0a-b990-a1995069e259/volumes" Nov 29 14:56:03 crc kubenswrapper[4907]: I1129 14:56:03.480232 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:56:03 crc kubenswrapper[4907]: E1129 14:56:03.480894 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:56:05 crc kubenswrapper[4907]: I1129 14:56:05.985518 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/aodh-db-sync-47w2x"] Nov 29 14:56:05 crc kubenswrapper[4907]: I1129 14:56:05.997776 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-47w2x"] Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.100770 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-rqzth"] Nov 29 14:56:06 crc kubenswrapper[4907]: E1129 14:56:06.101261 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea22f3e-15c5-4f6e-9269-1da424d29342" containerName="heat-api" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.101283 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea22f3e-15c5-4f6e-9269-1da424d29342" containerName="heat-api" Nov 29 14:56:06 crc kubenswrapper[4907]: E1129 14:56:06.101316 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62227625-11c5-4d0a-b990-a1995069e259" containerName="heat-cfnapi" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.101323 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="62227625-11c5-4d0a-b990-a1995069e259" containerName="heat-cfnapi" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.101577 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea22f3e-15c5-4f6e-9269-1da424d29342" containerName="heat-api" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.101604 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="62227625-11c5-4d0a-b990-a1995069e259" containerName="heat-cfnapi" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.102452 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.106153 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.126705 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rqzth"] Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.157643 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-config-data\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.157704 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-scripts\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.157786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xq2v\" (UniqueName: \"kubernetes.io/projected/a59cb670-d91b-41d5-a529-305c35b1bd12-kube-api-access-2xq2v\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.157830 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-combined-ca-bundle\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.260344 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-config-data\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.260409 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-scripts\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.260493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xq2v\" (UniqueName: \"kubernetes.io/projected/a59cb670-d91b-41d5-a529-305c35b1bd12-kube-api-access-2xq2v\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.260539 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-combined-ca-bundle\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.265909 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-scripts\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.268060 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-combined-ca-bundle\") 
pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.288088 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-config-data\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.316142 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xq2v\" (UniqueName: \"kubernetes.io/projected/a59cb670-d91b-41d5-a529-305c35b1bd12-kube-api-access-2xq2v\") pod \"aodh-db-sync-rqzth\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.431234 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:06 crc kubenswrapper[4907]: I1129 14:56:06.492712 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a1f021-2c7f-497d-9f2e-f64cefe8822d" path="/var/lib/kubelet/pods/22a1f021-2c7f-497d-9f2e-f64cefe8822d/volumes" Nov 29 14:56:07 crc kubenswrapper[4907]: I1129 14:56:07.334871 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rqzth"] Nov 29 14:56:07 crc kubenswrapper[4907]: W1129 14:56:07.343364 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda59cb670_d91b_41d5_a529_305c35b1bd12.slice/crio-3b61e419bdb9c76826a524750d0fdf7c04fe09f9f9978a44babe8ce5331d3664 WatchSource:0}: Error finding container 3b61e419bdb9c76826a524750d0fdf7c04fe09f9f9978a44babe8ce5331d3664: Status 404 returned error can't find the container with id 3b61e419bdb9c76826a524750d0fdf7c04fe09f9f9978a44babe8ce5331d3664 Nov 29 14:56:07 crc kubenswrapper[4907]: 
I1129 14:56:07.723097 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rqzth" event={"ID":"a59cb670-d91b-41d5-a529-305c35b1bd12","Type":"ContainerStarted","Data":"3b61e419bdb9c76826a524750d0fdf7c04fe09f9f9978a44babe8ce5331d3664"} Nov 29 14:56:09 crc kubenswrapper[4907]: E1129 14:56:09.264132 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="297d0cd5ee3cbab6d0c630d62813378481056594efe1d066186f89706e7980ac" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Nov 29 14:56:09 crc kubenswrapper[4907]: E1129 14:56:09.265940 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="297d0cd5ee3cbab6d0c630d62813378481056594efe1d066186f89706e7980ac" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Nov 29 14:56:09 crc kubenswrapper[4907]: E1129 14:56:09.267449 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="297d0cd5ee3cbab6d0c630d62813378481056594efe1d066186f89706e7980ac" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Nov 29 14:56:09 crc kubenswrapper[4907]: E1129 14:56:09.267487 4907 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-cdc66dd5-7pf8b" podUID="08229553-e114-45c9-a109-b01223241912" containerName="heat-engine" Nov 29 14:56:10 crc kubenswrapper[4907]: I1129 14:56:10.588668 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 29 14:56:10 crc 
kubenswrapper[4907]: I1129 14:56:10.617651 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 29 14:56:12 crc kubenswrapper[4907]: I1129 14:56:12.794455 4907 generic.go:334] "Generic (PLEG): container finished" podID="08229553-e114-45c9-a109-b01223241912" containerID="297d0cd5ee3cbab6d0c630d62813378481056594efe1d066186f89706e7980ac" exitCode=0 Nov 29 14:56:12 crc kubenswrapper[4907]: I1129 14:56:12.794724 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-cdc66dd5-7pf8b" event={"ID":"08229553-e114-45c9-a109-b01223241912","Type":"ContainerDied","Data":"297d0cd5ee3cbab6d0c630d62813378481056594efe1d066186f89706e7980ac"} Nov 29 14:56:12 crc kubenswrapper[4907]: I1129 14:56:12.797772 4907 generic.go:334] "Generic (PLEG): container finished" podID="6af6aa2c-2dee-441f-8607-cd1aec4d6fc3" containerID="5218bdc69a0f10e5f563f9857aae9548accffec13577a3b0290192d1f219d2dc" exitCode=0 Nov 29 14:56:12 crc kubenswrapper[4907]: I1129 14:56:12.797805 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" event={"ID":"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3","Type":"ContainerDied","Data":"5218bdc69a0f10e5f563f9857aae9548accffec13577a3b0290192d1f219d2dc"} Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.298834 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.393912 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.412555 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data-custom\") pod \"08229553-e114-45c9-a109-b01223241912\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.412633 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-combined-ca-bundle\") pod \"08229553-e114-45c9-a109-b01223241912\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.412728 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbxmp\" (UniqueName: \"kubernetes.io/projected/08229553-e114-45c9-a109-b01223241912-kube-api-access-mbxmp\") pod \"08229553-e114-45c9-a109-b01223241912\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.412868 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data\") pod \"08229553-e114-45c9-a109-b01223241912\" (UID: \"08229553-e114-45c9-a109-b01223241912\") " Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.420645 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08229553-e114-45c9-a109-b01223241912-kube-api-access-mbxmp" (OuterVolumeSpecName: "kube-api-access-mbxmp") pod "08229553-e114-45c9-a109-b01223241912" (UID: "08229553-e114-45c9-a109-b01223241912"). InnerVolumeSpecName "kube-api-access-mbxmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.420947 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08229553-e114-45c9-a109-b01223241912" (UID: "08229553-e114-45c9-a109-b01223241912"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.454402 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08229553-e114-45c9-a109-b01223241912" (UID: "08229553-e114-45c9-a109-b01223241912"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.489079 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data" (OuterVolumeSpecName: "config-data") pod "08229553-e114-45c9-a109-b01223241912" (UID: "08229553-e114-45c9-a109-b01223241912"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.514923 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-inventory\") pod \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.515280 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm5x2\" (UniqueName: \"kubernetes.io/projected/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-kube-api-access-mm5x2\") pod \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.515512 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-repo-setup-combined-ca-bundle\") pod \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.515575 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-ssh-key\") pod \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\" (UID: \"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3\") " Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.516231 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.516248 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.516259 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbxmp\" (UniqueName: \"kubernetes.io/projected/08229553-e114-45c9-a109-b01223241912-kube-api-access-mbxmp\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.516270 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08229553-e114-45c9-a109-b01223241912-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.519447 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6af6aa2c-2dee-441f-8607-cd1aec4d6fc3" (UID: "6af6aa2c-2dee-441f-8607-cd1aec4d6fc3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.519936 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-kube-api-access-mm5x2" (OuterVolumeSpecName: "kube-api-access-mm5x2") pod "6af6aa2c-2dee-441f-8607-cd1aec4d6fc3" (UID: "6af6aa2c-2dee-441f-8607-cd1aec4d6fc3"). InnerVolumeSpecName "kube-api-access-mm5x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.561255 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-inventory" (OuterVolumeSpecName: "inventory") pod "6af6aa2c-2dee-441f-8607-cd1aec4d6fc3" (UID: "6af6aa2c-2dee-441f-8607-cd1aec4d6fc3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.562125 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6af6aa2c-2dee-441f-8607-cd1aec4d6fc3" (UID: "6af6aa2c-2dee-441f-8607-cd1aec4d6fc3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.618526 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.618558 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm5x2\" (UniqueName: \"kubernetes.io/projected/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-kube-api-access-mm5x2\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.618570 4907 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.618580 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6af6aa2c-2dee-441f-8607-cd1aec4d6fc3-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.833743 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" event={"ID":"6af6aa2c-2dee-441f-8607-cd1aec4d6fc3","Type":"ContainerDied","Data":"8281ef03dc0ed23bcf232f2602d62f349461652ca0dd071d998f1b00eaea156b"} Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.833891 4907 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="8281ef03dc0ed23bcf232f2602d62f349461652ca0dd071d998f1b00eaea156b" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.834040 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.849211 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rqzth" event={"ID":"a59cb670-d91b-41d5-a529-305c35b1bd12","Type":"ContainerStarted","Data":"a814bf3a9083fc8e446abdf6f7c2e0a65168cbffe17329514089e65fa26384a8"} Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.853154 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-cdc66dd5-7pf8b" event={"ID":"08229553-e114-45c9-a109-b01223241912","Type":"ContainerDied","Data":"d596db3276e7955a15284e266b1a29e22a79ae44a6e5858fefd5ef09c31529a0"} Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.853356 4907 scope.go:117] "RemoveContainer" containerID="297d0cd5ee3cbab6d0c630d62813378481056594efe1d066186f89706e7980ac" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.853620 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-cdc66dd5-7pf8b" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.887406 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-rqzth" podStartSLOduration=2.351372429 podStartE2EDuration="8.887389992s" podCreationTimestamp="2025-11-29 14:56:06 +0000 UTC" firstStartedPulling="2025-11-29 14:56:07.345670803 +0000 UTC m=+1665.332508455" lastFinishedPulling="2025-11-29 14:56:13.881688366 +0000 UTC m=+1671.868526018" observedRunningTime="2025-11-29 14:56:14.876608889 +0000 UTC m=+1672.863446541" watchObservedRunningTime="2025-11-29 14:56:14.887389992 +0000 UTC m=+1672.874227644" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.932693 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-cdc66dd5-7pf8b"] Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.963824 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p"] Nov 29 14:56:14 crc kubenswrapper[4907]: E1129 14:56:14.966135 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af6aa2c-2dee-441f-8607-cd1aec4d6fc3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.966159 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af6aa2c-2dee-441f-8607-cd1aec4d6fc3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 29 14:56:14 crc kubenswrapper[4907]: E1129 14:56:14.966215 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08229553-e114-45c9-a109-b01223241912" containerName="heat-engine" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.966224 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="08229553-e114-45c9-a109-b01223241912" containerName="heat-engine" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.967190 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="08229553-e114-45c9-a109-b01223241912" containerName="heat-engine" Nov 29 14:56:14 crc kubenswrapper[4907]: I1129 14:56:14.967224 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af6aa2c-2dee-441f-8607-cd1aec4d6fc3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.023281 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.033201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vsb6p\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.033273 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vsb6p\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.033393 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44ncq\" (UniqueName: \"kubernetes.io/projected/36c11dee-b610-4a94-937c-63b049c54f14-kube-api-access-44ncq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vsb6p\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.047856 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/heat-engine-cdc66dd5-7pf8b"] Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.051986 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.061579 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p"] Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.062024 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.062364 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.062500 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.136793 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44ncq\" (UniqueName: \"kubernetes.io/projected/36c11dee-b610-4a94-937c-63b049c54f14-kube-api-access-44ncq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vsb6p\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.136904 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vsb6p\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.136952 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vsb6p\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.148177 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vsb6p\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.165023 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vsb6p\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.178163 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44ncq\" (UniqueName: \"kubernetes.io/projected/36c11dee-b610-4a94-937c-63b049c54f14-kube-api-access-44ncq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-vsb6p\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.345951 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:15 crc kubenswrapper[4907]: I1129 14:56:15.896370 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p"] Nov 29 14:56:15 crc kubenswrapper[4907]: W1129 14:56:15.904887 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36c11dee_b610_4a94_937c_63b049c54f14.slice/crio-b009036f849a110fa6ad516441baf9b8667b104673a9fb8d004a4e8f8fbe9890 WatchSource:0}: Error finding container b009036f849a110fa6ad516441baf9b8667b104673a9fb8d004a4e8f8fbe9890: Status 404 returned error can't find the container with id b009036f849a110fa6ad516441baf9b8667b104673a9fb8d004a4e8f8fbe9890 Nov 29 14:56:16 crc kubenswrapper[4907]: I1129 14:56:16.480028 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:56:16 crc kubenswrapper[4907]: E1129 14:56:16.480786 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:56:16 crc kubenswrapper[4907]: I1129 14:56:16.498665 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08229553-e114-45c9-a109-b01223241912" path="/var/lib/kubelet/pods/08229553-e114-45c9-a109-b01223241912/volumes" Nov 29 14:56:16 crc kubenswrapper[4907]: I1129 14:56:16.893013 4907 generic.go:334] "Generic (PLEG): container finished" podID="a59cb670-d91b-41d5-a529-305c35b1bd12" containerID="a814bf3a9083fc8e446abdf6f7c2e0a65168cbffe17329514089e65fa26384a8" exitCode=0 Nov 29 
14:56:16 crc kubenswrapper[4907]: I1129 14:56:16.893109 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rqzth" event={"ID":"a59cb670-d91b-41d5-a529-305c35b1bd12","Type":"ContainerDied","Data":"a814bf3a9083fc8e446abdf6f7c2e0a65168cbffe17329514089e65fa26384a8"} Nov 29 14:56:16 crc kubenswrapper[4907]: I1129 14:56:16.894765 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" event={"ID":"36c11dee-b610-4a94-937c-63b049c54f14","Type":"ContainerStarted","Data":"b53cfe9de5fd74c701b4abc9467b709a15ccd8cc25def0401584648a8ca89c56"} Nov 29 14:56:16 crc kubenswrapper[4907]: I1129 14:56:16.894801 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" event={"ID":"36c11dee-b610-4a94-937c-63b049c54f14","Type":"ContainerStarted","Data":"b009036f849a110fa6ad516441baf9b8667b104673a9fb8d004a4e8f8fbe9890"} Nov 29 14:56:16 crc kubenswrapper[4907]: I1129 14:56:16.934143 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" podStartSLOduration=2.506315004 podStartE2EDuration="2.934123951s" podCreationTimestamp="2025-11-29 14:56:14 +0000 UTC" firstStartedPulling="2025-11-29 14:56:15.910205711 +0000 UTC m=+1673.897043363" lastFinishedPulling="2025-11-29 14:56:16.338014658 +0000 UTC m=+1674.324852310" observedRunningTime="2025-11-29 14:56:16.929404008 +0000 UTC m=+1674.916241660" watchObservedRunningTime="2025-11-29 14:56:16.934123951 +0000 UTC m=+1674.920961603" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.403918 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.439068 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-scripts\") pod \"a59cb670-d91b-41d5-a529-305c35b1bd12\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.439180 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-config-data\") pod \"a59cb670-d91b-41d5-a529-305c35b1bd12\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.439328 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-combined-ca-bundle\") pod \"a59cb670-d91b-41d5-a529-305c35b1bd12\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.439559 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xq2v\" (UniqueName: \"kubernetes.io/projected/a59cb670-d91b-41d5-a529-305c35b1bd12-kube-api-access-2xq2v\") pod \"a59cb670-d91b-41d5-a529-305c35b1bd12\" (UID: \"a59cb670-d91b-41d5-a529-305c35b1bd12\") " Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.446664 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-scripts" (OuterVolumeSpecName: "scripts") pod "a59cb670-d91b-41d5-a529-305c35b1bd12" (UID: "a59cb670-d91b-41d5-a529-305c35b1bd12"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.447616 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59cb670-d91b-41d5-a529-305c35b1bd12-kube-api-access-2xq2v" (OuterVolumeSpecName: "kube-api-access-2xq2v") pod "a59cb670-d91b-41d5-a529-305c35b1bd12" (UID: "a59cb670-d91b-41d5-a529-305c35b1bd12"). InnerVolumeSpecName "kube-api-access-2xq2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.476178 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a59cb670-d91b-41d5-a529-305c35b1bd12" (UID: "a59cb670-d91b-41d5-a529-305c35b1bd12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.477978 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-config-data" (OuterVolumeSpecName: "config-data") pod "a59cb670-d91b-41d5-a529-305c35b1bd12" (UID: "a59cb670-d91b-41d5-a529-305c35b1bd12"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.541979 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.542131 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xq2v\" (UniqueName: \"kubernetes.io/projected/a59cb670-d91b-41d5-a529-305c35b1bd12-kube-api-access-2xq2v\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.542171 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.542186 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59cb670-d91b-41d5-a529-305c35b1bd12-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.933141 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rqzth" event={"ID":"a59cb670-d91b-41d5-a529-305c35b1bd12","Type":"ContainerDied","Data":"3b61e419bdb9c76826a524750d0fdf7c04fe09f9f9978a44babe8ce5331d3664"} Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.933577 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b61e419bdb9c76826a524750d0fdf7c04fe09f9f9978a44babe8ce5331d3664" Nov 29 14:56:18 crc kubenswrapper[4907]: I1129 14:56:18.933217 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rqzth" Nov 29 14:56:19 crc kubenswrapper[4907]: I1129 14:56:19.952394 4907 generic.go:334] "Generic (PLEG): container finished" podID="36c11dee-b610-4a94-937c-63b049c54f14" containerID="b53cfe9de5fd74c701b4abc9467b709a15ccd8cc25def0401584648a8ca89c56" exitCode=0 Nov 29 14:56:19 crc kubenswrapper[4907]: I1129 14:56:19.952494 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" event={"ID":"36c11dee-b610-4a94-937c-63b049c54f14","Type":"ContainerDied","Data":"b53cfe9de5fd74c701b4abc9467b709a15ccd8cc25def0401584648a8ca89c56"} Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.148181 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.148789 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-api" containerID="cri-o://c81b19aac835c840d324b3961a7b59ad255930d8a67a01bb9b75945d77cfbf4c" gracePeriod=30 Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.149245 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-listener" containerID="cri-o://94a71fea1213343a7a067f53646d435c8e0e7770ab67d13a6d620564f004b446" gracePeriod=30 Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.149289 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-notifier" containerID="cri-o://03b493e8e697ff13a13c62970b81ca8b601cfb98dc11ede4f617cbf54546b31f" gracePeriod=30 Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.149322 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" 
podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-evaluator" containerID="cri-o://6bad483b902c5fb5512c2928e012996114decb1d42421e75d2632c700f8acfe7" gracePeriod=30 Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.488792 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.535948 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-inventory\") pod \"36c11dee-b610-4a94-937c-63b049c54f14\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.536004 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44ncq\" (UniqueName: \"kubernetes.io/projected/36c11dee-b610-4a94-937c-63b049c54f14-kube-api-access-44ncq\") pod \"36c11dee-b610-4a94-937c-63b049c54f14\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.536278 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-ssh-key\") pod \"36c11dee-b610-4a94-937c-63b049c54f14\" (UID: \"36c11dee-b610-4a94-937c-63b049c54f14\") " Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.546735 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c11dee-b610-4a94-937c-63b049c54f14-kube-api-access-44ncq" (OuterVolumeSpecName: "kube-api-access-44ncq") pod "36c11dee-b610-4a94-937c-63b049c54f14" (UID: "36c11dee-b610-4a94-937c-63b049c54f14"). InnerVolumeSpecName "kube-api-access-44ncq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.569145 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-inventory" (OuterVolumeSpecName: "inventory") pod "36c11dee-b610-4a94-937c-63b049c54f14" (UID: "36c11dee-b610-4a94-937c-63b049c54f14"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.580879 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36c11dee-b610-4a94-937c-63b049c54f14" (UID: "36c11dee-b610-4a94-937c-63b049c54f14"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.639525 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.639561 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44ncq\" (UniqueName: \"kubernetes.io/projected/36c11dee-b610-4a94-937c-63b049c54f14-kube-api-access-44ncq\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.639624 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c11dee-b610-4a94-937c-63b049c54f14-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.986088 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" 
event={"ID":"36c11dee-b610-4a94-937c-63b049c54f14","Type":"ContainerDied","Data":"b009036f849a110fa6ad516441baf9b8667b104673a9fb8d004a4e8f8fbe9890"} Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.986148 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b009036f849a110fa6ad516441baf9b8667b104673a9fb8d004a4e8f8fbe9890" Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.986231 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-vsb6p" Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.993516 4907 generic.go:334] "Generic (PLEG): container finished" podID="10c5cb11-136f-44a4-a616-39ed1723237e" containerID="6bad483b902c5fb5512c2928e012996114decb1d42421e75d2632c700f8acfe7" exitCode=0 Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.993755 4907 generic.go:334] "Generic (PLEG): container finished" podID="10c5cb11-136f-44a4-a616-39ed1723237e" containerID="c81b19aac835c840d324b3961a7b59ad255930d8a67a01bb9b75945d77cfbf4c" exitCode=0 Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.993915 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerDied","Data":"6bad483b902c5fb5512c2928e012996114decb1d42421e75d2632c700f8acfe7"} Nov 29 14:56:21 crc kubenswrapper[4907]: I1129 14:56:21.994113 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerDied","Data":"c81b19aac835c840d324b3961a7b59ad255930d8a67a01bb9b75945d77cfbf4c"} Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.086886 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz"] Nov 29 14:56:22 crc kubenswrapper[4907]: E1129 14:56:22.087900 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a59cb670-d91b-41d5-a529-305c35b1bd12" containerName="aodh-db-sync" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.087929 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59cb670-d91b-41d5-a529-305c35b1bd12" containerName="aodh-db-sync" Nov 29 14:56:22 crc kubenswrapper[4907]: E1129 14:56:22.088028 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c11dee-b610-4a94-937c-63b049c54f14" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.088042 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c11dee-b610-4a94-937c-63b049c54f14" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.088340 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59cb670-d91b-41d5-a529-305c35b1bd12" containerName="aodh-db-sync" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.088365 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c11dee-b610-4a94-937c-63b049c54f14" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.089385 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.097750 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.097771 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.097821 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.097930 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.112100 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz"] Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.157548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fbkv\" (UniqueName: \"kubernetes.io/projected/527f58ea-f7a3-43c4-aeee-b22f40560466-kube-api-access-8fbkv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.157696 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.157937 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.158031 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.260829 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.260902 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.260933 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.261027 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbkv\" (UniqueName: \"kubernetes.io/projected/527f58ea-f7a3-43c4-aeee-b22f40560466-kube-api-access-8fbkv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.265571 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.267080 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.271929 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.293813 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8fbkv\" (UniqueName: \"kubernetes.io/projected/527f58ea-f7a3-43c4-aeee-b22f40560466-kube-api-access-8fbkv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.417131 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.425399 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:56:22 crc kubenswrapper[4907]: I1129 14:56:22.992258 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz"] Nov 29 14:56:23 crc kubenswrapper[4907]: I1129 14:56:23.005688 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" event={"ID":"527f58ea-f7a3-43c4-aeee-b22f40560466","Type":"ContainerStarted","Data":"f910150124cde1c9aba437d0019aab862145dc363f6344811c8bcec83e3cd221"} Nov 29 14:56:23 crc kubenswrapper[4907]: I1129 14:56:23.388244 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.021540 4907 generic.go:334] "Generic (PLEG): container finished" podID="10c5cb11-136f-44a4-a616-39ed1723237e" containerID="94a71fea1213343a7a067f53646d435c8e0e7770ab67d13a6d620564f004b446" exitCode=0 Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.021797 4907 generic.go:334] "Generic (PLEG): container finished" podID="10c5cb11-136f-44a4-a616-39ed1723237e" containerID="03b493e8e697ff13a13c62970b81ca8b601cfb98dc11ede4f617cbf54546b31f" exitCode=0 Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 
14:56:24.021618 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerDied","Data":"94a71fea1213343a7a067f53646d435c8e0e7770ab67d13a6d620564f004b446"} Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.021871 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerDied","Data":"03b493e8e697ff13a13c62970b81ca8b601cfb98dc11ede4f617cbf54546b31f"} Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.024979 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" event={"ID":"527f58ea-f7a3-43c4-aeee-b22f40560466","Type":"ContainerStarted","Data":"38f48d5b9622eb460c7d28d5399be5731521c51813c1b224752f124e3d894fd9"} Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.063027 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" podStartSLOduration=1.6631317220000001 podStartE2EDuration="2.063003423s" podCreationTimestamp="2025-11-29 14:56:22 +0000 UTC" firstStartedPulling="2025-11-29 14:56:22.985106564 +0000 UTC m=+1680.971944216" lastFinishedPulling="2025-11-29 14:56:23.384978235 +0000 UTC m=+1681.371815917" observedRunningTime="2025-11-29 14:56:24.045101579 +0000 UTC m=+1682.031939231" watchObservedRunningTime="2025-11-29 14:56:24.063003423 +0000 UTC m=+1682.049841085" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.124131 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.241518 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-internal-tls-certs\") pod \"10c5cb11-136f-44a4-a616-39ed1723237e\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.241567 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvlqq\" (UniqueName: \"kubernetes.io/projected/10c5cb11-136f-44a4-a616-39ed1723237e-kube-api-access-rvlqq\") pod \"10c5cb11-136f-44a4-a616-39ed1723237e\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.241671 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-public-tls-certs\") pod \"10c5cb11-136f-44a4-a616-39ed1723237e\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.241778 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-config-data\") pod \"10c5cb11-136f-44a4-a616-39ed1723237e\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.241811 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-combined-ca-bundle\") pod \"10c5cb11-136f-44a4-a616-39ed1723237e\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.241835 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-scripts\") pod \"10c5cb11-136f-44a4-a616-39ed1723237e\" (UID: \"10c5cb11-136f-44a4-a616-39ed1723237e\") " Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.251673 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c5cb11-136f-44a4-a616-39ed1723237e-kube-api-access-rvlqq" (OuterVolumeSpecName: "kube-api-access-rvlqq") pod "10c5cb11-136f-44a4-a616-39ed1723237e" (UID: "10c5cb11-136f-44a4-a616-39ed1723237e"). InnerVolumeSpecName "kube-api-access-rvlqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.251927 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-scripts" (OuterVolumeSpecName: "scripts") pod "10c5cb11-136f-44a4-a616-39ed1723237e" (UID: "10c5cb11-136f-44a4-a616-39ed1723237e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.319296 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "10c5cb11-136f-44a4-a616-39ed1723237e" (UID: "10c5cb11-136f-44a4-a616-39ed1723237e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.330961 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "10c5cb11-136f-44a4-a616-39ed1723237e" (UID: "10c5cb11-136f-44a4-a616-39ed1723237e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.344110 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.344144 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvlqq\" (UniqueName: \"kubernetes.io/projected/10c5cb11-136f-44a4-a616-39ed1723237e-kube-api-access-rvlqq\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.344155 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.344164 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.389404 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-config-data" (OuterVolumeSpecName: "config-data") pod "10c5cb11-136f-44a4-a616-39ed1723237e" (UID: "10c5cb11-136f-44a4-a616-39ed1723237e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.453040 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.466068 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10c5cb11-136f-44a4-a616-39ed1723237e" (UID: "10c5cb11-136f-44a4-a616-39ed1723237e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:56:24 crc kubenswrapper[4907]: I1129 14:56:24.558709 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c5cb11-136f-44a4-a616-39ed1723237e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.042027 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.042112 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"10c5cb11-136f-44a4-a616-39ed1723237e","Type":"ContainerDied","Data":"8a5a07cf25175044421e134e9c60ba32ca0e06c4d1711a2aa427248fd2017799"} Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.043135 4907 scope.go:117] "RemoveContainer" containerID="94a71fea1213343a7a067f53646d435c8e0e7770ab67d13a6d620564f004b446" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.089337 4907 scope.go:117] "RemoveContainer" containerID="03b493e8e697ff13a13c62970b81ca8b601cfb98dc11ede4f617cbf54546b31f" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.095605 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.120638 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.124312 4907 scope.go:117] "RemoveContainer" containerID="6bad483b902c5fb5512c2928e012996114decb1d42421e75d2632c700f8acfe7" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.138829 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 29 14:56:25 crc kubenswrapper[4907]: E1129 14:56:25.139498 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-listener" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.139513 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-listener" Nov 29 14:56:25 crc kubenswrapper[4907]: E1129 14:56:25.139536 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-notifier" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.139542 4907 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-notifier" Nov 29 14:56:25 crc kubenswrapper[4907]: E1129 14:56:25.139558 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-api" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.139564 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-api" Nov 29 14:56:25 crc kubenswrapper[4907]: E1129 14:56:25.139592 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-evaluator" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.139598 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-evaluator" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.139803 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-api" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.139826 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-evaluator" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.139947 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-listener" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.139979 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" containerName="aodh-notifier" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.142141 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.144781 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.144996 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-b88t7" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.145168 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.145329 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.145533 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.150910 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.172751 4907 scope.go:117] "RemoveContainer" containerID="c81b19aac835c840d324b3961a7b59ad255930d8a67a01bb9b75945d77cfbf4c" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.310179 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-scripts\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.310252 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk2w8\" (UniqueName: \"kubernetes.io/projected/ceb3061f-2f32-4e89-98f0-628f316bef79-kube-api-access-wk2w8\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.310274 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-public-tls-certs\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.310293 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-internal-tls-certs\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.310357 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.310394 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-config-data\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.412410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-scripts\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.412482 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk2w8\" (UniqueName: \"kubernetes.io/projected/ceb3061f-2f32-4e89-98f0-628f316bef79-kube-api-access-wk2w8\") pod 
\"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.412505 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-public-tls-certs\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.412524 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-internal-tls-certs\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.412561 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.412597 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-config-data\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.427084 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-public-tls-certs\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.427292 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.427722 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-internal-tls-certs\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.430022 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-scripts\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.434591 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb3061f-2f32-4e89-98f0-628f316bef79-config-data\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.435049 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk2w8\" (UniqueName: \"kubernetes.io/projected/ceb3061f-2f32-4e89-98f0-628f316bef79-kube-api-access-wk2w8\") pod \"aodh-0\" (UID: \"ceb3061f-2f32-4e89-98f0-628f316bef79\") " pod="openstack/aodh-0" Nov 29 14:56:25 crc kubenswrapper[4907]: I1129 14:56:25.469635 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 29 14:56:26 crc kubenswrapper[4907]: I1129 14:56:26.042825 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 29 14:56:26 crc kubenswrapper[4907]: I1129 14:56:26.053657 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ceb3061f-2f32-4e89-98f0-628f316bef79","Type":"ContainerStarted","Data":"a1bfa6f0c32f8ecd460d0e31cbde18f9a330936a7ddbf4c37cf1c40eb0a65b52"} Nov 29 14:56:26 crc kubenswrapper[4907]: I1129 14:56:26.494018 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c5cb11-136f-44a4-a616-39ed1723237e" path="/var/lib/kubelet/pods/10c5cb11-136f-44a4-a616-39ed1723237e/volumes" Nov 29 14:56:27 crc kubenswrapper[4907]: I1129 14:56:27.068555 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ceb3061f-2f32-4e89-98f0-628f316bef79","Type":"ContainerStarted","Data":"fa9ce5df9381d66055a1b386d14261799c8b8f97d1be243566aecfc83675cccb"} Nov 29 14:56:28 crc kubenswrapper[4907]: I1129 14:56:28.085781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ceb3061f-2f32-4e89-98f0-628f316bef79","Type":"ContainerStarted","Data":"415c9fb5af6c008573f6938ed2c2949ac954267ae9a1467c92edc9cd91f00398"} Nov 29 14:56:29 crc kubenswrapper[4907]: I1129 14:56:29.101520 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ceb3061f-2f32-4e89-98f0-628f316bef79","Type":"ContainerStarted","Data":"7f2a1edc544aaa20308baa32b1843383539fde2440add6b89019250204e8e526"} Nov 29 14:56:30 crc kubenswrapper[4907]: I1129 14:56:30.120127 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ceb3061f-2f32-4e89-98f0-628f316bef79","Type":"ContainerStarted","Data":"b6f7d0751fa1ab59798261d1c7ffb23f966ae6c3b2bd211247bc704b5b933dfd"} Nov 29 14:56:30 crc kubenswrapper[4907]: I1129 14:56:30.160234 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.507265825 podStartE2EDuration="5.160211517s" podCreationTimestamp="2025-11-29 14:56:25 +0000 UTC" firstStartedPulling="2025-11-29 14:56:26.036368606 +0000 UTC m=+1684.023206258" lastFinishedPulling="2025-11-29 14:56:29.689314298 +0000 UTC m=+1687.676151950" observedRunningTime="2025-11-29 14:56:30.149581068 +0000 UTC m=+1688.136418760" watchObservedRunningTime="2025-11-29 14:56:30.160211517 +0000 UTC m=+1688.147049199" Nov 29 14:56:30 crc kubenswrapper[4907]: I1129 14:56:30.479425 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:56:30 crc kubenswrapper[4907]: E1129 14:56:30.479790 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:56:45 crc kubenswrapper[4907]: I1129 14:56:45.480370 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:56:45 crc kubenswrapper[4907]: E1129 14:56:45.481711 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:56:58 crc kubenswrapper[4907]: I1129 14:56:58.480894 4907 scope.go:117] "RemoveContainer" 
containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:56:58 crc kubenswrapper[4907]: E1129 14:56:58.481710 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:57:11 crc kubenswrapper[4907]: I1129 14:57:11.479182 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:57:11 crc kubenswrapper[4907]: E1129 14:57:11.480064 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:57:24 crc kubenswrapper[4907]: I1129 14:57:24.480294 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:57:24 crc kubenswrapper[4907]: E1129 14:57:24.481196 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:57:25 crc kubenswrapper[4907]: I1129 14:57:25.584226 4907 scope.go:117] 
"RemoveContainer" containerID="8fdcdd1b28e6cdbe5dad885dff079712be2bfe4635dd08d1555aa547006ff028" Nov 29 14:57:25 crc kubenswrapper[4907]: I1129 14:57:25.636517 4907 scope.go:117] "RemoveContainer" containerID="7fc160d1dcbba4c1f63142c2b38c370bcd6c3373d098a0ef4b6e8d749d144a31" Nov 29 14:57:25 crc kubenswrapper[4907]: I1129 14:57:25.681409 4907 scope.go:117] "RemoveContainer" containerID="4d567e1378c65706be58e4b02188fc84061792e5e0cb65d989fc6545e0a5f7af" Nov 29 14:57:25 crc kubenswrapper[4907]: I1129 14:57:25.758833 4907 scope.go:117] "RemoveContainer" containerID="ebf036c804778e50de4af4602abd5562148a2b93f8da8c6dd577baf9978c623b" Nov 29 14:57:25 crc kubenswrapper[4907]: I1129 14:57:25.801541 4907 scope.go:117] "RemoveContainer" containerID="cac75ed3a6ff16fc4b44777d7cb5db755c231a7da24efa3c9e860cc0942554b2" Nov 29 14:57:25 crc kubenswrapper[4907]: I1129 14:57:25.866756 4907 scope.go:117] "RemoveContainer" containerID="7e8cbc93d8d2f69d58bda5141aed2f4b91f0365b241ea68c2b3ff9697d99b781" Nov 29 14:57:35 crc kubenswrapper[4907]: I1129 14:57:35.480792 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:57:35 crc kubenswrapper[4907]: E1129 14:57:35.482221 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:57:48 crc kubenswrapper[4907]: I1129 14:57:48.480717 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:57:48 crc kubenswrapper[4907]: E1129 14:57:48.481555 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:58:01 crc kubenswrapper[4907]: I1129 14:58:01.480094 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:58:01 crc kubenswrapper[4907]: E1129 14:58:01.481237 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:58:15 crc kubenswrapper[4907]: I1129 14:58:15.481119 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:58:15 crc kubenswrapper[4907]: E1129 14:58:15.482394 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:58:26 crc kubenswrapper[4907]: I1129 14:58:26.053348 4907 scope.go:117] "RemoveContainer" containerID="cb28a6d071c6ecb64928ca29f3a0ece93cd1195fca05b49b9612aa48dcedc1ed" Nov 29 14:58:29 crc kubenswrapper[4907]: I1129 14:58:29.480105 4907 scope.go:117] "RemoveContainer" 
containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:58:29 crc kubenswrapper[4907]: E1129 14:58:29.481263 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:58:41 crc kubenswrapper[4907]: I1129 14:58:41.479430 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:58:41 crc kubenswrapper[4907]: E1129 14:58:41.480229 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:58:55 crc kubenswrapper[4907]: I1129 14:58:55.480236 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:58:55 crc kubenswrapper[4907]: E1129 14:58:55.481458 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:59:06 crc kubenswrapper[4907]: I1129 14:59:06.481042 4907 scope.go:117] 
"RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:59:06 crc kubenswrapper[4907]: E1129 14:59:06.482658 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:59:21 crc kubenswrapper[4907]: I1129 14:59:21.481045 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:59:21 crc kubenswrapper[4907]: E1129 14:59:21.482741 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:59:35 crc kubenswrapper[4907]: I1129 14:59:35.480645 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:59:35 crc kubenswrapper[4907]: E1129 14:59:35.481633 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 14:59:37 crc kubenswrapper[4907]: I1129 14:59:37.853313 
4907 generic.go:334] "Generic (PLEG): container finished" podID="527f58ea-f7a3-43c4-aeee-b22f40560466" containerID="38f48d5b9622eb460c7d28d5399be5731521c51813c1b224752f124e3d894fd9" exitCode=0 Nov 29 14:59:37 crc kubenswrapper[4907]: I1129 14:59:37.853386 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" event={"ID":"527f58ea-f7a3-43c4-aeee-b22f40560466","Type":"ContainerDied","Data":"38f48d5b9622eb460c7d28d5399be5731521c51813c1b224752f124e3d894fd9"} Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.518749 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.610603 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-bootstrap-combined-ca-bundle\") pod \"527f58ea-f7a3-43c4-aeee-b22f40560466\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.610766 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-ssh-key\") pod \"527f58ea-f7a3-43c4-aeee-b22f40560466\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.610881 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fbkv\" (UniqueName: \"kubernetes.io/projected/527f58ea-f7a3-43c4-aeee-b22f40560466-kube-api-access-8fbkv\") pod \"527f58ea-f7a3-43c4-aeee-b22f40560466\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.611132 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-inventory\") pod \"527f58ea-f7a3-43c4-aeee-b22f40560466\" (UID: \"527f58ea-f7a3-43c4-aeee-b22f40560466\") " Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.628385 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527f58ea-f7a3-43c4-aeee-b22f40560466-kube-api-access-8fbkv" (OuterVolumeSpecName: "kube-api-access-8fbkv") pod "527f58ea-f7a3-43c4-aeee-b22f40560466" (UID: "527f58ea-f7a3-43c4-aeee-b22f40560466"). InnerVolumeSpecName "kube-api-access-8fbkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.628617 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "527f58ea-f7a3-43c4-aeee-b22f40560466" (UID: "527f58ea-f7a3-43c4-aeee-b22f40560466"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.657671 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "527f58ea-f7a3-43c4-aeee-b22f40560466" (UID: "527f58ea-f7a3-43c4-aeee-b22f40560466"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.681367 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-inventory" (OuterVolumeSpecName: "inventory") pod "527f58ea-f7a3-43c4-aeee-b22f40560466" (UID: "527f58ea-f7a3-43c4-aeee-b22f40560466"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.716019 4907 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.716072 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.716091 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fbkv\" (UniqueName: \"kubernetes.io/projected/527f58ea-f7a3-43c4-aeee-b22f40560466-kube-api-access-8fbkv\") on node \"crc\" DevicePath \"\"" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.716107 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/527f58ea-f7a3-43c4-aeee-b22f40560466-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.883886 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" event={"ID":"527f58ea-f7a3-43c4-aeee-b22f40560466","Type":"ContainerDied","Data":"f910150124cde1c9aba437d0019aab862145dc363f6344811c8bcec83e3cd221"} Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.883944 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f910150124cde1c9aba437d0019aab862145dc363f6344811c8bcec83e3cd221" Nov 29 14:59:39 crc kubenswrapper[4907]: I1129 14:59:39.883954 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.032143 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww"] Nov 29 14:59:40 crc kubenswrapper[4907]: E1129 14:59:40.033160 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527f58ea-f7a3-43c4-aeee-b22f40560466" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.033189 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="527f58ea-f7a3-43c4-aeee-b22f40560466" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.033542 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="527f58ea-f7a3-43c4-aeee-b22f40560466" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.034614 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.038152 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.038188 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.041941 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.042148 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.072427 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww"] Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.123846 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.123921 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.124216 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzx8\" (UniqueName: \"kubernetes.io/projected/c36b07ee-345f-4815-8cb9-25085e925d6a-kube-api-access-8vzx8\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.226804 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzx8\" (UniqueName: \"kubernetes.io/projected/c36b07ee-345f-4815-8cb9-25085e925d6a-kube-api-access-8vzx8\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.226995 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.227055 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.232385 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.234421 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.247651 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzx8\" (UniqueName: \"kubernetes.io/projected/c36b07ee-345f-4815-8cb9-25085e925d6a-kube-api-access-8vzx8\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.370016 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" Nov 29 14:59:40 crc kubenswrapper[4907]: I1129 14:59:40.973943 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww"] Nov 29 14:59:41 crc kubenswrapper[4907]: I1129 14:59:41.909063 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" event={"ID":"c36b07ee-345f-4815-8cb9-25085e925d6a","Type":"ContainerStarted","Data":"ca3a4ddf3f9b2adffe0179eeb6b3d3fecebf8e00108b33c42aca08b59e868989"} Nov 29 14:59:41 crc kubenswrapper[4907]: I1129 14:59:41.909411 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" event={"ID":"c36b07ee-345f-4815-8cb9-25085e925d6a","Type":"ContainerStarted","Data":"f5fe4ef236399c3bfcf881a7993de3da2c25d14cafe41101504bc3ab5108aa87"} Nov 29 14:59:41 crc kubenswrapper[4907]: I1129 14:59:41.944989 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" podStartSLOduration=2.466716364 podStartE2EDuration="2.94496732s" podCreationTimestamp="2025-11-29 14:59:39 +0000 UTC" firstStartedPulling="2025-11-29 14:59:40.939085269 +0000 UTC m=+1878.925922961" lastFinishedPulling="2025-11-29 14:59:41.417336265 +0000 UTC m=+1879.404173917" observedRunningTime="2025-11-29 14:59:41.934781193 +0000 UTC m=+1879.921618875" watchObservedRunningTime="2025-11-29 14:59:41.94496732 +0000 UTC m=+1879.931804982" Nov 29 14:59:50 crc kubenswrapper[4907]: I1129 14:59:50.480312 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 14:59:50 crc kubenswrapper[4907]: E1129 14:59:50.481706 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.165805 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m"] Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.167919 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.170754 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.176979 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.192085 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba0e8a79-8f7d-4441-85e7-616d33386673-secret-volume\") pod \"collect-profiles-29407140-2gq7m\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.192364 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrgt\" (UniqueName: \"kubernetes.io/projected/ba0e8a79-8f7d-4441-85e7-616d33386673-kube-api-access-tsrgt\") pod \"collect-profiles-29407140-2gq7m\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.192430 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba0e8a79-8f7d-4441-85e7-616d33386673-config-volume\") pod \"collect-profiles-29407140-2gq7m\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.204096 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m"] Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.294913 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba0e8a79-8f7d-4441-85e7-616d33386673-config-volume\") pod \"collect-profiles-29407140-2gq7m\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.295152 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba0e8a79-8f7d-4441-85e7-616d33386673-secret-volume\") pod \"collect-profiles-29407140-2gq7m\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.295344 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrgt\" (UniqueName: \"kubernetes.io/projected/ba0e8a79-8f7d-4441-85e7-616d33386673-kube-api-access-tsrgt\") pod \"collect-profiles-29407140-2gq7m\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 
crc kubenswrapper[4907]: I1129 15:00:00.296071 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba0e8a79-8f7d-4441-85e7-616d33386673-config-volume\") pod \"collect-profiles-29407140-2gq7m\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.301305 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba0e8a79-8f7d-4441-85e7-616d33386673-secret-volume\") pod \"collect-profiles-29407140-2gq7m\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.313657 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrgt\" (UniqueName: \"kubernetes.io/projected/ba0e8a79-8f7d-4441-85e7-616d33386673-kube-api-access-tsrgt\") pod \"collect-profiles-29407140-2gq7m\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.501036 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:00 crc kubenswrapper[4907]: I1129 15:00:00.987780 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m"] Nov 29 15:00:00 crc kubenswrapper[4907]: W1129 15:00:00.994591 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba0e8a79_8f7d_4441_85e7_616d33386673.slice/crio-92e8cda0a9149a37529b64f2a3ab48bad2c959553f23f2660ff5c7ed6e631df9 WatchSource:0}: Error finding container 92e8cda0a9149a37529b64f2a3ab48bad2c959553f23f2660ff5c7ed6e631df9: Status 404 returned error can't find the container with id 92e8cda0a9149a37529b64f2a3ab48bad2c959553f23f2660ff5c7ed6e631df9 Nov 29 15:00:01 crc kubenswrapper[4907]: I1129 15:00:01.170312 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" event={"ID":"ba0e8a79-8f7d-4441-85e7-616d33386673","Type":"ContainerStarted","Data":"92e8cda0a9149a37529b64f2a3ab48bad2c959553f23f2660ff5c7ed6e631df9"} Nov 29 15:00:02 crc kubenswrapper[4907]: I1129 15:00:02.199576 4907 generic.go:334] "Generic (PLEG): container finished" podID="ba0e8a79-8f7d-4441-85e7-616d33386673" containerID="77cb45b089dac74176bde7b90f34c9f927c06ea36d48eeac2820c23bcd119795" exitCode=0 Nov 29 15:00:02 crc kubenswrapper[4907]: I1129 15:00:02.199683 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" event={"ID":"ba0e8a79-8f7d-4441-85e7-616d33386673","Type":"ContainerDied","Data":"77cb45b089dac74176bde7b90f34c9f927c06ea36d48eeac2820c23bcd119795"} Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.480246 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 15:00:03 crc kubenswrapper[4907]: 
I1129 15:00:03.744291 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.892417 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba0e8a79-8f7d-4441-85e7-616d33386673-secret-volume\") pod \"ba0e8a79-8f7d-4441-85e7-616d33386673\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.892559 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsrgt\" (UniqueName: \"kubernetes.io/projected/ba0e8a79-8f7d-4441-85e7-616d33386673-kube-api-access-tsrgt\") pod \"ba0e8a79-8f7d-4441-85e7-616d33386673\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.892721 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba0e8a79-8f7d-4441-85e7-616d33386673-config-volume\") pod \"ba0e8a79-8f7d-4441-85e7-616d33386673\" (UID: \"ba0e8a79-8f7d-4441-85e7-616d33386673\") " Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.893538 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba0e8a79-8f7d-4441-85e7-616d33386673-config-volume" (OuterVolumeSpecName: "config-volume") pod "ba0e8a79-8f7d-4441-85e7-616d33386673" (UID: "ba0e8a79-8f7d-4441-85e7-616d33386673"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.893788 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba0e8a79-8f7d-4441-85e7-616d33386673-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.899756 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0e8a79-8f7d-4441-85e7-616d33386673-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ba0e8a79-8f7d-4441-85e7-616d33386673" (UID: "ba0e8a79-8f7d-4441-85e7-616d33386673"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.905981 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba0e8a79-8f7d-4441-85e7-616d33386673-kube-api-access-tsrgt" (OuterVolumeSpecName: "kube-api-access-tsrgt") pod "ba0e8a79-8f7d-4441-85e7-616d33386673" (UID: "ba0e8a79-8f7d-4441-85e7-616d33386673"). InnerVolumeSpecName "kube-api-access-tsrgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.995960 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba0e8a79-8f7d-4441-85e7-616d33386673-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 15:00:03 crc kubenswrapper[4907]: I1129 15:00:03.995993 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsrgt\" (UniqueName: \"kubernetes.io/projected/ba0e8a79-8f7d-4441-85e7-616d33386673-kube-api-access-tsrgt\") on node \"crc\" DevicePath \"\"" Nov 29 15:00:04 crc kubenswrapper[4907]: I1129 15:00:04.230948 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"3f223864d0e084019513cae274c0d44a852a5d1a01e7ee167f1727b3298f32cf"} Nov 29 15:00:04 crc kubenswrapper[4907]: I1129 15:00:04.233313 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" event={"ID":"ba0e8a79-8f7d-4441-85e7-616d33386673","Type":"ContainerDied","Data":"92e8cda0a9149a37529b64f2a3ab48bad2c959553f23f2660ff5c7ed6e631df9"} Nov 29 15:00:04 crc kubenswrapper[4907]: I1129 15:00:04.233348 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92e8cda0a9149a37529b64f2a3ab48bad2c959553f23f2660ff5c7ed6e631df9" Nov 29 15:00:04 crc kubenswrapper[4907]: I1129 15:00:04.233353 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m" Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.073796 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b489-account-create-update-ntbwq"] Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.091363 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b489-account-create-update-ntbwq"] Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.104090 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-b9xn8"] Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.118498 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kbxgc"] Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.130378 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kbxgc"] Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.141190 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-b9xn8"] Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.150903 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5da6-account-create-update-7jqbf"] Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.159913 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5da6-account-create-update-7jqbf"] Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.507896 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab63941-4052-4105-b09e-2bd04a34ed2d" path="/var/lib/kubelet/pods/0ab63941-4052-4105-b09e-2bd04a34ed2d/volumes" Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.511161 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a891cd77-26e1-42f2-bac1-dc68dd51d2d3" path="/var/lib/kubelet/pods/a891cd77-26e1-42f2-bac1-dc68dd51d2d3/volumes" Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 
15:00:14.512824 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c15eb212-5dab-4f9f-bea0-6f3899a36a8b" path="/var/lib/kubelet/pods/c15eb212-5dab-4f9f-bea0-6f3899a36a8b/volumes" Nov 29 15:00:14 crc kubenswrapper[4907]: I1129 15:00:14.515168 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a97393-4ffa-49d7-a070-aa2758fe10ed" path="/var/lib/kubelet/pods/f4a97393-4ffa-49d7-a070-aa2758fe10ed/volumes" Nov 29 15:00:17 crc kubenswrapper[4907]: I1129 15:00:17.051522 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6933-account-create-update-jjhlm"] Nov 29 15:00:17 crc kubenswrapper[4907]: I1129 15:00:17.071109 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6933-account-create-update-jjhlm"] Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.042097 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-k9rqw"] Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.052696 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-lfz28"] Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.063748 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-b9b0-account-create-update-gkhzg"] Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.074986 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-k9rqw"] Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.084715 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-lfz28"] Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.095063 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-b9b0-account-create-update-gkhzg"] Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.497741 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0dac16ae-ba20-4405-9e83-73dae3db6f5f" path="/var/lib/kubelet/pods/0dac16ae-ba20-4405-9e83-73dae3db6f5f/volumes" Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.500424 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e46549-a315-4113-a0d2-5aafb96a7f12" path="/var/lib/kubelet/pods/18e46549-a315-4113-a0d2-5aafb96a7f12/volumes" Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.502819 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f" path="/var/lib/kubelet/pods/5040bcc9-5a8d-4a9e-bdc6-b283a9ce828f/volumes" Nov 29 15:00:18 crc kubenswrapper[4907]: I1129 15:00:18.505857 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faad6940-2c08-4b7a-bdc0-b00def56a777" path="/var/lib/kubelet/pods/faad6940-2c08-4b7a-bdc0-b00def56a777/volumes" Nov 29 15:00:26 crc kubenswrapper[4907]: I1129 15:00:26.142678 4907 scope.go:117] "RemoveContainer" containerID="0013934d8d00b3c8ce7cfbb8b75c8c7f675d4f3dbef8ed7c8fdc96c66c4743ff" Nov 29 15:00:26 crc kubenswrapper[4907]: I1129 15:00:26.207868 4907 scope.go:117] "RemoveContainer" containerID="13f30d1a8b7841519241bf7e5ac0d1e54fcb9641ec7f4ffb445550a209028be1" Nov 29 15:00:26 crc kubenswrapper[4907]: I1129 15:00:26.270353 4907 scope.go:117] "RemoveContainer" containerID="4e712654f5ddcdfe5720ed2630869def36bb4b2548bd443cc1dd7d00126b9120" Nov 29 15:00:26 crc kubenswrapper[4907]: I1129 15:00:26.318219 4907 scope.go:117] "RemoveContainer" containerID="47e2b0cbe89453e8ced43cb2bf7f7f8c8cd98fb6d4430f91cc2daf4aae076804" Nov 29 15:00:26 crc kubenswrapper[4907]: I1129 15:00:26.368274 4907 scope.go:117] "RemoveContainer" containerID="9129b513c0f177b903b12982ef113951dfd5446c789a2b50a6e7901fb704ada9" Nov 29 15:00:26 crc kubenswrapper[4907]: I1129 15:00:26.417464 4907 scope.go:117] "RemoveContainer" containerID="3c11463cbe3643f16f5afcdeba176dc2fba8065478e6aeb4d300edda040e1205" Nov 29 15:00:26 crc kubenswrapper[4907]: I1129 
15:00:26.453286 4907 scope.go:117] "RemoveContainer" containerID="6dd0bd74d1a2a61be5f1b19b14801f8c85942a65cee72f00839051d566dafa6b" Nov 29 15:00:26 crc kubenswrapper[4907]: I1129 15:00:26.484658 4907 scope.go:117] "RemoveContainer" containerID="3077e6a58d6e911bfd1d1aa471d710e78c3a4991eea2bfdfe50a5594c9ef9c64" Nov 29 15:00:26 crc kubenswrapper[4907]: I1129 15:00:26.523831 4907 scope.go:117] "RemoveContainer" containerID="3c508cc8b7bb90b5c9f338c6c452b78303a9c108d1b1a328b6142efc7f16d225" Nov 29 15:00:28 crc kubenswrapper[4907]: I1129 15:00:28.058057 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-nlrlx"] Nov 29 15:00:28 crc kubenswrapper[4907]: I1129 15:00:28.080251 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-nlrlx"] Nov 29 15:00:28 crc kubenswrapper[4907]: I1129 15:00:28.513778 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e7ec61-ebc8-4a53-be29-9243e33b6ca7" path="/var/lib/kubelet/pods/25e7ec61-ebc8-4a53-be29-9243e33b6ca7/volumes" Nov 29 15:00:29 crc kubenswrapper[4907]: I1129 15:00:29.054242 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4mwp5"] Nov 29 15:00:29 crc kubenswrapper[4907]: I1129 15:00:29.069886 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4mwp5"] Nov 29 15:00:30 crc kubenswrapper[4907]: I1129 15:00:30.505783 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d0cf369-d1c2-495e-82d9-31d8f75b3538" path="/var/lib/kubelet/pods/4d0cf369-d1c2-495e-82d9-31d8f75b3538/volumes" Nov 29 15:00:33 crc kubenswrapper[4907]: I1129 15:00:33.037396 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv"] Nov 29 15:00:33 crc kubenswrapper[4907]: I1129 15:00:33.054622 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-bf9d-account-create-update-q2tvd"] Nov 29 15:00:33 
crc kubenswrapper[4907]: I1129 15:00:33.069624 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-5dzcv"] Nov 29 15:00:33 crc kubenswrapper[4907]: I1129 15:00:33.080615 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2bed-account-create-update-kkzsn"] Nov 29 15:00:33 crc kubenswrapper[4907]: I1129 15:00:33.090308 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-bf9d-account-create-update-q2tvd"] Nov 29 15:00:33 crc kubenswrapper[4907]: I1129 15:00:33.101689 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2bed-account-create-update-kkzsn"] Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.049423 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-72c9-account-create-update-27v75"] Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.066690 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-dsvvn"] Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.080267 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-4ea7-account-create-update-ft8pf"] Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.092762 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7e4b-account-create-update-k4vjz"] Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.102263 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-w9lsr"] Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.111095 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7e4b-account-create-update-k4vjz"] Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.120508 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-dsvvn"] Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.128948 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/mysqld-exporter-72c9-account-create-update-27v75"]
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.137968 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-w9lsr"]
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.149239 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-4ea7-account-create-update-ft8pf"]
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.495038 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="083d8a82-cfe8-4bd9-b612-9466ca400e16" path="/var/lib/kubelet/pods/083d8a82-cfe8-4bd9-b612-9466ca400e16/volumes"
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.495856 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1359b44b-6aa8-48f7-98e0-faea12df3d79" path="/var/lib/kubelet/pods/1359b44b-6aa8-48f7-98e0-faea12df3d79/volumes"
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.496772 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="521498fd-fd08-4dc4-ba76-3a92a99cc7a1" path="/var/lib/kubelet/pods/521498fd-fd08-4dc4-ba76-3a92a99cc7a1/volumes"
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.498058 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1c8d94-4154-402a-858d-fd819787ba8e" path="/var/lib/kubelet/pods/6c1c8d94-4154-402a-858d-fd819787ba8e/volumes"
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.499563 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da37e82-34e6-45a1-a1b8-a373467376a9" path="/var/lib/kubelet/pods/6da37e82-34e6-45a1-a1b8-a373467376a9/volumes"
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.500694 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70d12fd2-0c0c-435c-863d-0b5445b67460" path="/var/lib/kubelet/pods/70d12fd2-0c0c-435c-863d-0b5445b67460/volumes"
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.501493 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331" path="/var/lib/kubelet/pods/9d4d85f5-e23c-49f0-bb24-ecd0f0ad5331/volumes"
Nov 29 15:00:34 crc kubenswrapper[4907]: I1129 15:00:34.502982 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ed5dab-4be9-4c17-934d-75ec8a900d7c" path="/var/lib/kubelet/pods/d6ed5dab-4be9-4c17-934d-75ec8a900d7c/volumes"
Nov 29 15:00:43 crc kubenswrapper[4907]: I1129 15:00:43.030739 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-prtgl"]
Nov 29 15:00:43 crc kubenswrapper[4907]: I1129 15:00:43.044745 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-prtgl"]
Nov 29 15:00:44 crc kubenswrapper[4907]: I1129 15:00:44.504994 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b47a96b-53c6-4e4e-b92f-a0c12c5310b3" path="/var/lib/kubelet/pods/9b47a96b-53c6-4e4e-b92f-a0c12c5310b3/volumes"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.191789 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29407141-rgrbj"]
Nov 29 15:01:00 crc kubenswrapper[4907]: E1129 15:01:00.194044 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0e8a79-8f7d-4441-85e7-616d33386673" containerName="collect-profiles"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.194072 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0e8a79-8f7d-4441-85e7-616d33386673" containerName="collect-profiles"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.194671 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0e8a79-8f7d-4441-85e7-616d33386673" containerName="collect-profiles"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.197580 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.206223 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29407141-rgrbj"]
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.251872 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-fernet-keys\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.252153 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-combined-ca-bundle\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.252228 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w6dd\" (UniqueName: \"kubernetes.io/projected/55d5a03b-62e0-412d-97d5-99a260862255-kube-api-access-7w6dd\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.252504 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-config-data\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.355672 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-fernet-keys\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.355846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-combined-ca-bundle\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.355882 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w6dd\" (UniqueName: \"kubernetes.io/projected/55d5a03b-62e0-412d-97d5-99a260862255-kube-api-access-7w6dd\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.355958 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-config-data\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.367705 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-combined-ca-bundle\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.374849 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-config-data\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.375110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-fernet-keys\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.379074 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w6dd\" (UniqueName: \"kubernetes.io/projected/55d5a03b-62e0-412d-97d5-99a260862255-kube-api-access-7w6dd\") pod \"keystone-cron-29407141-rgrbj\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") " pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:00 crc kubenswrapper[4907]: I1129 15:01:00.535205 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:01 crc kubenswrapper[4907]: I1129 15:01:01.050168 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29407141-rgrbj"]
Nov 29 15:01:01 crc kubenswrapper[4907]: I1129 15:01:01.192917 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29407141-rgrbj" event={"ID":"55d5a03b-62e0-412d-97d5-99a260862255","Type":"ContainerStarted","Data":"ac03e9a72526de13097a65aae69923994f6e854933bf83b246b7ff069bd91dec"}
Nov 29 15:01:02 crc kubenswrapper[4907]: I1129 15:01:02.211072 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29407141-rgrbj" event={"ID":"55d5a03b-62e0-412d-97d5-99a260862255","Type":"ContainerStarted","Data":"5f1553c86d8ba0c94249ef148030591ddeb814b460893a6fd8af01f37f419653"}
Nov 29 15:01:02 crc kubenswrapper[4907]: I1129 15:01:02.249968 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29407141-rgrbj" podStartSLOduration=2.249941538 podStartE2EDuration="2.249941538s" podCreationTimestamp="2025-11-29 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 15:01:02.232506826 +0000 UTC m=+1960.219344508" watchObservedRunningTime="2025-11-29 15:01:02.249941538 +0000 UTC m=+1960.236779230"
Nov 29 15:01:04 crc kubenswrapper[4907]: I1129 15:01:04.232813 4907 generic.go:334] "Generic (PLEG): container finished" podID="55d5a03b-62e0-412d-97d5-99a260862255" containerID="5f1553c86d8ba0c94249ef148030591ddeb814b460893a6fd8af01f37f419653" exitCode=0
Nov 29 15:01:04 crc kubenswrapper[4907]: I1129 15:01:04.232901 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29407141-rgrbj" event={"ID":"55d5a03b-62e0-412d-97d5-99a260862255","Type":"ContainerDied","Data":"5f1553c86d8ba0c94249ef148030591ddeb814b460893a6fd8af01f37f419653"}
Nov 29 15:01:05 crc kubenswrapper[4907]: I1129 15:01:05.746984 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:05 crc kubenswrapper[4907]: I1129 15:01:05.921355 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w6dd\" (UniqueName: \"kubernetes.io/projected/55d5a03b-62e0-412d-97d5-99a260862255-kube-api-access-7w6dd\") pod \"55d5a03b-62e0-412d-97d5-99a260862255\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") "
Nov 29 15:01:05 crc kubenswrapper[4907]: I1129 15:01:05.921424 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-config-data\") pod \"55d5a03b-62e0-412d-97d5-99a260862255\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") "
Nov 29 15:01:05 crc kubenswrapper[4907]: I1129 15:01:05.921775 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-fernet-keys\") pod \"55d5a03b-62e0-412d-97d5-99a260862255\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") "
Nov 29 15:01:05 crc kubenswrapper[4907]: I1129 15:01:05.921839 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-combined-ca-bundle\") pod \"55d5a03b-62e0-412d-97d5-99a260862255\" (UID: \"55d5a03b-62e0-412d-97d5-99a260862255\") "
Nov 29 15:01:05 crc kubenswrapper[4907]: I1129 15:01:05.931391 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d5a03b-62e0-412d-97d5-99a260862255-kube-api-access-7w6dd" (OuterVolumeSpecName: "kube-api-access-7w6dd") pod "55d5a03b-62e0-412d-97d5-99a260862255" (UID: "55d5a03b-62e0-412d-97d5-99a260862255"). InnerVolumeSpecName "kube-api-access-7w6dd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 15:01:05 crc kubenswrapper[4907]: I1129 15:01:05.931527 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "55d5a03b-62e0-412d-97d5-99a260862255" (UID: "55d5a03b-62e0-412d-97d5-99a260862255"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:01:05 crc kubenswrapper[4907]: I1129 15:01:05.979503 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55d5a03b-62e0-412d-97d5-99a260862255" (UID: "55d5a03b-62e0-412d-97d5-99a260862255"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:01:06 crc kubenswrapper[4907]: I1129 15:01:06.025534 4907 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 29 15:01:06 crc kubenswrapper[4907]: I1129 15:01:06.025598 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 15:01:06 crc kubenswrapper[4907]: I1129 15:01:06.025630 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w6dd\" (UniqueName: \"kubernetes.io/projected/55d5a03b-62e0-412d-97d5-99a260862255-kube-api-access-7w6dd\") on node \"crc\" DevicePath \"\""
Nov 29 15:01:06 crc kubenswrapper[4907]: I1129 15:01:06.025912 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-config-data" (OuterVolumeSpecName: "config-data") pod "55d5a03b-62e0-412d-97d5-99a260862255" (UID: "55d5a03b-62e0-412d-97d5-99a260862255"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:01:06 crc kubenswrapper[4907]: I1129 15:01:06.128698 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55d5a03b-62e0-412d-97d5-99a260862255-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 15:01:06 crc kubenswrapper[4907]: I1129 15:01:06.273873 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29407141-rgrbj" event={"ID":"55d5a03b-62e0-412d-97d5-99a260862255","Type":"ContainerDied","Data":"ac03e9a72526de13097a65aae69923994f6e854933bf83b246b7ff069bd91dec"}
Nov 29 15:01:06 crc kubenswrapper[4907]: I1129 15:01:06.273936 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac03e9a72526de13097a65aae69923994f6e854933bf83b246b7ff069bd91dec"
Nov 29 15:01:06 crc kubenswrapper[4907]: I1129 15:01:06.274043 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29407141-rgrbj"
Nov 29 15:01:18 crc kubenswrapper[4907]: I1129 15:01:18.074250 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pwxst"]
Nov 29 15:01:18 crc kubenswrapper[4907]: I1129 15:01:18.088028 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pwxst"]
Nov 29 15:01:18 crc kubenswrapper[4907]: I1129 15:01:18.496732 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d940ef0-0877-471d-906a-b6235392867d" path="/var/lib/kubelet/pods/8d940ef0-0877-471d-906a-b6235392867d/volumes"
Nov 29 15:01:19 crc kubenswrapper[4907]: I1129 15:01:19.058494 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4r59x"]
Nov 29 15:01:19 crc kubenswrapper[4907]: I1129 15:01:19.075722 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4r59x"]
Nov 29 15:01:20 crc kubenswrapper[4907]: I1129 15:01:20.526513 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d771c9-5533-4167-adeb-f77c429ded79" path="/var/lib/kubelet/pods/a9d771c9-5533-4167-adeb-f77c429ded79/volumes"
Nov 29 15:01:26 crc kubenswrapper[4907]: I1129 15:01:26.830203 4907 scope.go:117] "RemoveContainer" containerID="0a473041a4a5bec0db20c0ffae9453e70991bd3141a2c075d2510200c2eea27c"
Nov 29 15:01:26 crc kubenswrapper[4907]: I1129 15:01:26.876668 4907 scope.go:117] "RemoveContainer" containerID="88961f8842b754e34e4ac8b6b42e57fc64097ffe28f55d74f54fb6858ebd082a"
Nov 29 15:01:26 crc kubenswrapper[4907]: I1129 15:01:26.955753 4907 scope.go:117] "RemoveContainer" containerID="e386dac190bea718b2ee91a443059e406fd22311ec85ee3b29866534fa051775"
Nov 29 15:01:26 crc kubenswrapper[4907]: I1129 15:01:26.996124 4907 scope.go:117] "RemoveContainer" containerID="99c9a12314c0a1e32c91651ac01ece45aec45412ee520201491f13fb17cc2ac2"
Nov 29 15:01:27 crc kubenswrapper[4907]: I1129 15:01:27.031970 4907 scope.go:117] "RemoveContainer" containerID="4bab71bb196e838c8325409c8adc8d7d8f80faf4927773e4e43f51e88ea532cc"
Nov 29 15:01:27 crc kubenswrapper[4907]: I1129 15:01:27.089278 4907 scope.go:117] "RemoveContainer" containerID="0e44cb7d9e0cf607fab8de1dd8856075be28bf8047bd168a698aa30210ba4d03"
Nov 29 15:01:27 crc kubenswrapper[4907]: I1129 15:01:27.138460 4907 scope.go:117] "RemoveContainer" containerID="dc5da379a81f564fd83c7672523bbf9843ea956a4c4babf58b27c97726b320d5"
Nov 29 15:01:27 crc kubenswrapper[4907]: I1129 15:01:27.159795 4907 scope.go:117] "RemoveContainer" containerID="26fb0f8b8acacb5aea1b72987c70e47593b178c5c4865c168661bc3c60010fa5"
Nov 29 15:01:27 crc kubenswrapper[4907]: I1129 15:01:27.190329 4907 scope.go:117] "RemoveContainer" containerID="d2edc9c9fb3ca2f844f121dfc2e6ed8f2336106e6e36b58f3e18e6ec1e24f83b"
Nov 29 15:01:27 crc kubenswrapper[4907]: I1129 15:01:27.213240 4907 scope.go:117] "RemoveContainer" containerID="9b1f9500838f026044ed9ddd30bf803d4f95c6d20f5fb3ca895f5f2dbe98f598"
Nov 29 15:01:27 crc kubenswrapper[4907]: I1129 15:01:27.239041 4907 scope.go:117] "RemoveContainer" containerID="47ca5682e6e0095166625dcc959b72109edbfc7c8cea1897f03314bc87ac9bf9"
Nov 29 15:01:27 crc kubenswrapper[4907]: I1129 15:01:27.269557 4907 scope.go:117] "RemoveContainer" containerID="b944be8db7cc0750ea5e25b0a2d3ca806a386547d1944b7ffc6f2c350e953da2"
Nov 29 15:01:27 crc kubenswrapper[4907]: I1129 15:01:27.306151 4907 scope.go:117] "RemoveContainer" containerID="9b26d73d2360b73483d19057cf34bc550a8f91e1cdbbfdf95b445b2fbd3d8419"
Nov 29 15:01:34 crc kubenswrapper[4907]: I1129 15:01:34.055159 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-chkqj"]
Nov 29 15:01:34 crc kubenswrapper[4907]: I1129 15:01:34.066480 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-chkqj"]
Nov 29 15:01:34 crc kubenswrapper[4907]: I1129 15:01:34.079786 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-92jsx"]
Nov 29 15:01:34 crc kubenswrapper[4907]: I1129 15:01:34.091144 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-6csw2"]
Nov 29 15:01:34 crc kubenswrapper[4907]: I1129 15:01:34.101286 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-92jsx"]
Nov 29 15:01:34 crc kubenswrapper[4907]: I1129 15:01:34.112381 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-6csw2"]
Nov 29 15:01:34 crc kubenswrapper[4907]: I1129 15:01:34.507355 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0089fab2-d07c-4dad-bce1-a4c085a35d24" path="/var/lib/kubelet/pods/0089fab2-d07c-4dad-bce1-a4c085a35d24/volumes"
Nov 29 15:01:34 crc kubenswrapper[4907]: I1129 15:01:34.510250 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a942b3b6-791f-4d18-abdd-113f3372158b" path="/var/lib/kubelet/pods/a942b3b6-791f-4d18-abdd-113f3372158b/volumes"
Nov 29 15:01:34 crc kubenswrapper[4907]: I1129 15:01:34.512367 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de06d815-0165-4c7a-aeed-fda3a647ba27" path="/var/lib/kubelet/pods/de06d815-0165-4c7a-aeed-fda3a647ba27/volumes"
Nov 29 15:01:44 crc kubenswrapper[4907]: I1129 15:01:44.917403 4907 generic.go:334] "Generic (PLEG): container finished" podID="c36b07ee-345f-4815-8cb9-25085e925d6a" containerID="ca3a4ddf3f9b2adffe0179eeb6b3d3fecebf8e00108b33c42aca08b59e868989" exitCode=0
Nov 29 15:01:44 crc kubenswrapper[4907]: I1129 15:01:44.918620 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" event={"ID":"c36b07ee-345f-4815-8cb9-25085e925d6a","Type":"ContainerDied","Data":"ca3a4ddf3f9b2adffe0179eeb6b3d3fecebf8e00108b33c42aca08b59e868989"}
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.538452 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww"
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.644566 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-inventory\") pod \"c36b07ee-345f-4815-8cb9-25085e925d6a\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") "
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.644688 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vzx8\" (UniqueName: \"kubernetes.io/projected/c36b07ee-345f-4815-8cb9-25085e925d6a-kube-api-access-8vzx8\") pod \"c36b07ee-345f-4815-8cb9-25085e925d6a\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") "
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.644744 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-ssh-key\") pod \"c36b07ee-345f-4815-8cb9-25085e925d6a\" (UID: \"c36b07ee-345f-4815-8cb9-25085e925d6a\") "
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.651751 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c36b07ee-345f-4815-8cb9-25085e925d6a-kube-api-access-8vzx8" (OuterVolumeSpecName: "kube-api-access-8vzx8") pod "c36b07ee-345f-4815-8cb9-25085e925d6a" (UID: "c36b07ee-345f-4815-8cb9-25085e925d6a"). InnerVolumeSpecName "kube-api-access-8vzx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.693976 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-inventory" (OuterVolumeSpecName: "inventory") pod "c36b07ee-345f-4815-8cb9-25085e925d6a" (UID: "c36b07ee-345f-4815-8cb9-25085e925d6a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.704236 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c36b07ee-345f-4815-8cb9-25085e925d6a" (UID: "c36b07ee-345f-4815-8cb9-25085e925d6a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.748608 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-inventory\") on node \"crc\" DevicePath \"\""
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.748642 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vzx8\" (UniqueName: \"kubernetes.io/projected/c36b07ee-345f-4815-8cb9-25085e925d6a-kube-api-access-8vzx8\") on node \"crc\" DevicePath \"\""
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.748653 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c36b07ee-345f-4815-8cb9-25085e925d6a-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.948471 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww" event={"ID":"c36b07ee-345f-4815-8cb9-25085e925d6a","Type":"ContainerDied","Data":"f5fe4ef236399c3bfcf881a7993de3da2c25d14cafe41101504bc3ab5108aa87"}
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.948774 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5fe4ef236399c3bfcf881a7993de3da2c25d14cafe41101504bc3ab5108aa87"
Nov 29 15:01:46 crc kubenswrapper[4907]: I1129 15:01:46.948577 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.031056 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"]
Nov 29 15:01:47 crc kubenswrapper[4907]: E1129 15:01:47.031591 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d5a03b-62e0-412d-97d5-99a260862255" containerName="keystone-cron"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.031615 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d5a03b-62e0-412d-97d5-99a260862255" containerName="keystone-cron"
Nov 29 15:01:47 crc kubenswrapper[4907]: E1129 15:01:47.031657 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36b07ee-345f-4815-8cb9-25085e925d6a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.031667 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36b07ee-345f-4815-8cb9-25085e925d6a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.031958 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d5a03b-62e0-412d-97d5-99a260862255" containerName="keystone-cron"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.031984 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36b07ee-345f-4815-8cb9-25085e925d6a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.032988 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.036388 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.036430 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.036398 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.037360 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.044310 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"]
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.159788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.159840 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.160218 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgr5v\" (UniqueName: \"kubernetes.io/projected/1266709f-ede6-4b61-b733-c40852501bb6-kube-api-access-mgr5v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.262920 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgr5v\" (UniqueName: \"kubernetes.io/projected/1266709f-ede6-4b61-b733-c40852501bb6-kube-api-access-mgr5v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.263419 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.263564 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.268454 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.280267 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.284486 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgr5v\" (UniqueName: \"kubernetes.io/projected/1266709f-ede6-4b61-b733-c40852501bb6-kube-api-access-mgr5v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:47 crc kubenswrapper[4907]: I1129 15:01:47.348466 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"
Nov 29 15:01:48 crc kubenswrapper[4907]: I1129 15:01:48.024065 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk"]
Nov 29 15:01:48 crc kubenswrapper[4907]: I1129 15:01:48.029742 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 29 15:01:48 crc kubenswrapper[4907]: I1129 15:01:48.041347 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-8lc4r"]
Nov 29 15:01:48 crc kubenswrapper[4907]: I1129 15:01:48.052379 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-8lc4r"]
Nov 29 15:01:48 crc kubenswrapper[4907]: I1129 15:01:48.509711 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de" path="/var/lib/kubelet/pods/d1e527c4-b1c6-4f80-a37c-a6fe6ab8c5de/volumes"
Nov 29 15:01:48 crc kubenswrapper[4907]: I1129 15:01:48.984989 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk" event={"ID":"1266709f-ede6-4b61-b733-c40852501bb6","Type":"ContainerStarted","Data":"e19b0b907821fb01c52f85ddc0cdf77bdc56fe73a930f9fc675edaeb268d8294"}
Nov 29 15:01:48 crc kubenswrapper[4907]: I1129 15:01:48.985289 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk" event={"ID":"1266709f-ede6-4b61-b733-c40852501bb6","Type":"ContainerStarted","Data":"c6ca3568063651362c5f34c09efd30e0743c806fd68c1edbd53614a81b052c50"}
Nov 29 15:01:49 crc kubenswrapper[4907]: I1129 15:01:49.009249 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk" podStartSLOduration=1.450716276 podStartE2EDuration="2.009228272s" podCreationTimestamp="2025-11-29 15:01:47 +0000 UTC" firstStartedPulling="2025-11-29 15:01:48.029487196 +0000 UTC m=+2006.016324848" lastFinishedPulling="2025-11-29 15:01:48.587999202 +0000 UTC m=+2006.574836844" observedRunningTime="2025-11-29 15:01:49.008035378 +0000 UTC m=+2006.994873040" watchObservedRunningTime="2025-11-29 15:01:49.009228272 +0000 UTC m=+2006.996065934"
Nov 29 15:02:20 crc kubenswrapper[4907]: I1129 15:02:20.061325 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-kdt5z"]
Nov 29 15:02:20 crc kubenswrapper[4907]: I1129 15:02:20.076740 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-r68r9"]
Nov 29 15:02:20 crc kubenswrapper[4907]: I1129 15:02:20.091558 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-r68r9"]
Nov 29 15:02:20 crc kubenswrapper[4907]: I1129 15:02:20.102364 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-kdt5z"]
Nov 29 15:02:20 crc kubenswrapper[4907]: I1129 15:02:20.522762 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f910f6-7e96-4d81-bf3f-5b7291b2da09" path="/var/lib/kubelet/pods/03f910f6-7e96-4d81-bf3f-5b7291b2da09/volumes"
Nov 29 15:02:20 crc kubenswrapper[4907]: I1129 15:02:20.524667 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e61658d-77a7-46d8-9718-b3077f6f5ff6" path="/var/lib/kubelet/pods/7e61658d-77a7-46d8-9718-b3077f6f5ff6/volumes"
Nov 29 15:02:21 crc kubenswrapper[4907]: I1129 15:02:21.054839 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-z87qn"]
Nov 29 15:02:21 crc kubenswrapper[4907]: I1129 15:02:21.070353 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a773-account-create-update-8ppqm"]
Nov 29 15:02:21 crc kubenswrapper[4907]: I1129 15:02:21.083669 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4909-account-create-update-nsndx"]
Nov 29 15:02:21 crc kubenswrapper[4907]: I1129 15:02:21.098325 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-fabb-account-create-update-wthdh"]
Nov 29 15:02:21 crc kubenswrapper[4907]: I1129 15:02:21.115785 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a773-account-create-update-8ppqm"]
Nov 29 15:02:21 crc kubenswrapper[4907]: I1129 15:02:21.128114 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-z87qn"]
Nov 29 15:02:21 crc kubenswrapper[4907]: I1129 15:02:21.140666 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-fabb-account-create-update-wthdh"]
Nov 29 15:02:21 crc kubenswrapper[4907]: I1129 15:02:21.153113 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4909-account-create-update-nsndx"]
Nov 29 15:02:22 crc kubenswrapper[4907]: I1129 15:02:22.524048 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d19cf04-d99b-4e24-ae1d-f5bf69e3d926" path="/var/lib/kubelet/pods/0d19cf04-d99b-4e24-ae1d-f5bf69e3d926/volumes"
Nov 29 15:02:22 crc kubenswrapper[4907]: I1129 15:02:22.525213 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ca437b-20a0-410b-b071-101b1ebe27cb" path="/var/lib/kubelet/pods/35ca437b-20a0-410b-b071-101b1ebe27cb/volumes"
Nov 29 15:02:22 crc kubenswrapper[4907]: I1129 15:02:22.526411 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61df189f-119b-46fd-877c-87265a18f8c5" path="/var/lib/kubelet/pods/61df189f-119b-46fd-877c-87265a18f8c5/volumes"
Nov 29 15:02:22 crc kubenswrapper[4907]: I1129 15:02:22.529149 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01a3b28-aaff-4cda-9908-f08d0d675669" path="/var/lib/kubelet/pods/a01a3b28-aaff-4cda-9908-f08d0d675669/volumes"
Nov 29 15:02:27 crc kubenswrapper[4907]: I1129 15:02:27.675165
4907 scope.go:117] "RemoveContainer" containerID="39a5a1f7616453bb870e47a8e4ac7c7bfe9eb960884396bf0b6e3d37581774be" Nov 29 15:02:27 crc kubenswrapper[4907]: I1129 15:02:27.714503 4907 scope.go:117] "RemoveContainer" containerID="63156ddbddad132ef95bfee0567526bc1dfc5f35237a842109c35d2633f01127" Nov 29 15:02:27 crc kubenswrapper[4907]: I1129 15:02:27.808128 4907 scope.go:117] "RemoveContainer" containerID="6f3c544987c7090c1b6dc149e2e9b8dfd7e51ae1a09aafdfef7b6b69d3838678" Nov 29 15:02:27 crc kubenswrapper[4907]: I1129 15:02:27.861671 4907 scope.go:117] "RemoveContainer" containerID="5bf0c625548a2fc71ba165eb7ac738064a7a77ab243cd0a4d8c97a01c5255df8" Nov 29 15:02:27 crc kubenswrapper[4907]: I1129 15:02:27.920249 4907 scope.go:117] "RemoveContainer" containerID="b68cb13406d0a8017927cd3d80d4e40ee73682af42ebeb1b67d5730700576161" Nov 29 15:02:27 crc kubenswrapper[4907]: I1129 15:02:27.982077 4907 scope.go:117] "RemoveContainer" containerID="61c63f7da93f96340fd0d6eb0c94c0c7d1b86db1fff9aac4c32497fef81c9ca5" Nov 29 15:02:28 crc kubenswrapper[4907]: I1129 15:02:28.028858 4907 scope.go:117] "RemoveContainer" containerID="c0a858029248c5eaacb368575a042e7b425868d82f5bbe1809bb92bbb43b7993" Nov 29 15:02:28 crc kubenswrapper[4907]: I1129 15:02:28.059176 4907 scope.go:117] "RemoveContainer" containerID="ac9f296b80c80c7e4491e969bf1720b0d272b9d26ceeb9092b3f0debe3ea4405" Nov 29 15:02:28 crc kubenswrapper[4907]: I1129 15:02:28.080300 4907 scope.go:117] "RemoveContainer" containerID="9b0ed0f0e53fc5a10c1673f198a2349d296f194971e6d10deaad5e2922bb0270" Nov 29 15:02:28 crc kubenswrapper[4907]: I1129 15:02:28.104970 4907 scope.go:117] "RemoveContainer" containerID="0bd911ebf8c3afdfa355fd7cbb762bd98e178dc4b1647485e73217b0b00edbab" Nov 29 15:02:28 crc kubenswrapper[4907]: I1129 15:02:28.491423 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:02:28 crc kubenswrapper[4907]: I1129 15:02:28.491661 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:02:58 crc kubenswrapper[4907]: I1129 15:02:58.490542 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:02:58 crc kubenswrapper[4907]: I1129 15:02:58.491174 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:03:03 crc kubenswrapper[4907]: I1129 15:03:03.059181 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dj5h6"] Nov 29 15:03:03 crc kubenswrapper[4907]: I1129 15:03:03.065250 4907 generic.go:334] "Generic (PLEG): container finished" podID="1266709f-ede6-4b61-b733-c40852501bb6" containerID="e19b0b907821fb01c52f85ddc0cdf77bdc56fe73a930f9fc675edaeb268d8294" exitCode=0 Nov 29 15:03:03 crc kubenswrapper[4907]: I1129 15:03:03.065301 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk" 
event={"ID":"1266709f-ede6-4b61-b733-c40852501bb6","Type":"ContainerDied","Data":"e19b0b907821fb01c52f85ddc0cdf77bdc56fe73a930f9fc675edaeb268d8294"} Nov 29 15:03:03 crc kubenswrapper[4907]: I1129 15:03:03.070404 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dj5h6"] Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.493501 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99873cd9-5727-4f0c-888d-ed6c6090abc1" path="/var/lib/kubelet/pods/99873cd9-5727-4f0c-888d-ed6c6090abc1/volumes" Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.596542 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk" Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.774261 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgr5v\" (UniqueName: \"kubernetes.io/projected/1266709f-ede6-4b61-b733-c40852501bb6-kube-api-access-mgr5v\") pod \"1266709f-ede6-4b61-b733-c40852501bb6\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.774384 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-inventory\") pod \"1266709f-ede6-4b61-b733-c40852501bb6\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.774617 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-ssh-key\") pod \"1266709f-ede6-4b61-b733-c40852501bb6\" (UID: \"1266709f-ede6-4b61-b733-c40852501bb6\") " Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.789241 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1266709f-ede6-4b61-b733-c40852501bb6-kube-api-access-mgr5v" (OuterVolumeSpecName: "kube-api-access-mgr5v") pod "1266709f-ede6-4b61-b733-c40852501bb6" (UID: "1266709f-ede6-4b61-b733-c40852501bb6"). InnerVolumeSpecName "kube-api-access-mgr5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.806349 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-inventory" (OuterVolumeSpecName: "inventory") pod "1266709f-ede6-4b61-b733-c40852501bb6" (UID: "1266709f-ede6-4b61-b733-c40852501bb6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.820591 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1266709f-ede6-4b61-b733-c40852501bb6" (UID: "1266709f-ede6-4b61-b733-c40852501bb6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.879637 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.879684 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1266709f-ede6-4b61-b733-c40852501bb6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:03:04 crc kubenswrapper[4907]: I1129 15:03:04.879705 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgr5v\" (UniqueName: \"kubernetes.io/projected/1266709f-ede6-4b61-b733-c40852501bb6-kube-api-access-mgr5v\") on node \"crc\" DevicePath \"\"" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.104303 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk" event={"ID":"1266709f-ede6-4b61-b733-c40852501bb6","Type":"ContainerDied","Data":"c6ca3568063651362c5f34c09efd30e0743c806fd68c1edbd53614a81b052c50"} Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.104368 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6ca3568063651362c5f34c09efd30e0743c806fd68c1edbd53614a81b052c50" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.104497 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.183257 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm"] Nov 29 15:03:05 crc kubenswrapper[4907]: E1129 15:03:05.184031 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1266709f-ede6-4b61-b733-c40852501bb6" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.184049 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1266709f-ede6-4b61-b733-c40852501bb6" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.184262 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1266709f-ede6-4b61-b733-c40852501bb6" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.185482 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.191013 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.191413 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.191570 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.191723 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.199062 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm"] Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.293552 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d76lm\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.293772 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pj6j\" (UniqueName: \"kubernetes.io/projected/5661a247-cd8b-4001-bbf9-841c52c59abc-kube-api-access-2pj6j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d76lm\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 
15:03:05.293878 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d76lm\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.396119 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d76lm\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.396207 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pj6j\" (UniqueName: \"kubernetes.io/projected/5661a247-cd8b-4001-bbf9-841c52c59abc-kube-api-access-2pj6j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d76lm\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.396230 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d76lm\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.401451 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-d76lm\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.401494 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d76lm\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.419012 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pj6j\" (UniqueName: \"kubernetes.io/projected/5661a247-cd8b-4001-bbf9-841c52c59abc-kube-api-access-2pj6j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d76lm\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:05 crc kubenswrapper[4907]: I1129 15:03:05.511757 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:06 crc kubenswrapper[4907]: I1129 15:03:06.202279 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm"] Nov 29 15:03:07 crc kubenswrapper[4907]: I1129 15:03:07.148410 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" event={"ID":"5661a247-cd8b-4001-bbf9-841c52c59abc","Type":"ContainerStarted","Data":"6d15b4d8bf73ff559288f448cff3c737ad6de3a443ebc4e85e2c2ce3e98735a2"} Nov 29 15:03:08 crc kubenswrapper[4907]: I1129 15:03:08.159598 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" event={"ID":"5661a247-cd8b-4001-bbf9-841c52c59abc","Type":"ContainerStarted","Data":"b5ec662ee6d50e694c7e1736b764e3a4cd5df81d4f2c56c90a6343aecee1aea1"} Nov 29 15:03:08 crc kubenswrapper[4907]: I1129 15:03:08.179403 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" podStartSLOduration=2.350660779 podStartE2EDuration="3.179381892s" podCreationTimestamp="2025-11-29 15:03:05 +0000 UTC" firstStartedPulling="2025-11-29 15:03:06.204589888 +0000 UTC m=+2084.191427540" lastFinishedPulling="2025-11-29 15:03:07.033310991 +0000 UTC m=+2085.020148653" observedRunningTime="2025-11-29 15:03:08.175008899 +0000 UTC m=+2086.161846551" watchObservedRunningTime="2025-11-29 15:03:08.179381892 +0000 UTC m=+2086.166219544" Nov 29 15:03:12 crc kubenswrapper[4907]: I1129 15:03:12.216801 4907 generic.go:334] "Generic (PLEG): container finished" podID="5661a247-cd8b-4001-bbf9-841c52c59abc" containerID="b5ec662ee6d50e694c7e1736b764e3a4cd5df81d4f2c56c90a6343aecee1aea1" exitCode=0 Nov 29 15:03:12 crc kubenswrapper[4907]: I1129 15:03:12.217069 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" event={"ID":"5661a247-cd8b-4001-bbf9-841c52c59abc","Type":"ContainerDied","Data":"b5ec662ee6d50e694c7e1736b764e3a4cd5df81d4f2c56c90a6343aecee1aea1"} Nov 29 15:03:13 crc kubenswrapper[4907]: I1129 15:03:13.745893 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:13 crc kubenswrapper[4907]: I1129 15:03:13.929148 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-inventory\") pod \"5661a247-cd8b-4001-bbf9-841c52c59abc\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " Nov 29 15:03:13 crc kubenswrapper[4907]: I1129 15:03:13.929250 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-ssh-key\") pod \"5661a247-cd8b-4001-bbf9-841c52c59abc\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " Nov 29 15:03:13 crc kubenswrapper[4907]: I1129 15:03:13.929283 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pj6j\" (UniqueName: \"kubernetes.io/projected/5661a247-cd8b-4001-bbf9-841c52c59abc-kube-api-access-2pj6j\") pod \"5661a247-cd8b-4001-bbf9-841c52c59abc\" (UID: \"5661a247-cd8b-4001-bbf9-841c52c59abc\") " Nov 29 15:03:13 crc kubenswrapper[4907]: I1129 15:03:13.940968 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5661a247-cd8b-4001-bbf9-841c52c59abc-kube-api-access-2pj6j" (OuterVolumeSpecName: "kube-api-access-2pj6j") pod "5661a247-cd8b-4001-bbf9-841c52c59abc" (UID: "5661a247-cd8b-4001-bbf9-841c52c59abc"). InnerVolumeSpecName "kube-api-access-2pj6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:03:13 crc kubenswrapper[4907]: I1129 15:03:13.969182 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-inventory" (OuterVolumeSpecName: "inventory") pod "5661a247-cd8b-4001-bbf9-841c52c59abc" (UID: "5661a247-cd8b-4001-bbf9-841c52c59abc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.001730 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5661a247-cd8b-4001-bbf9-841c52c59abc" (UID: "5661a247-cd8b-4001-bbf9-841c52c59abc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.033028 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.033068 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5661a247-cd8b-4001-bbf9-841c52c59abc-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.033081 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pj6j\" (UniqueName: \"kubernetes.io/projected/5661a247-cd8b-4001-bbf9-841c52c59abc-kube-api-access-2pj6j\") on node \"crc\" DevicePath \"\"" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.258121 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" 
event={"ID":"5661a247-cd8b-4001-bbf9-841c52c59abc","Type":"ContainerDied","Data":"6d15b4d8bf73ff559288f448cff3c737ad6de3a443ebc4e85e2c2ce3e98735a2"} Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.258175 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d15b4d8bf73ff559288f448cff3c737ad6de3a443ebc4e85e2c2ce3e98735a2" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.258216 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d76lm" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.371645 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj"] Nov 29 15:03:14 crc kubenswrapper[4907]: E1129 15:03:14.372941 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5661a247-cd8b-4001-bbf9-841c52c59abc" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.372996 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5661a247-cd8b-4001-bbf9-841c52c59abc" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.373713 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5661a247-cd8b-4001-bbf9-841c52c59abc" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.375886 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.381340 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.381617 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.381807 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.383313 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.386042 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj"] Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.547836 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b9cj\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.548197 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b9cj\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.548476 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bthkw\" (UniqueName: \"kubernetes.io/projected/79c3b8e5-9000-49cd-a62c-ae366a7592b0-kube-api-access-bthkw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b9cj\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.650459 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b9cj\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.650588 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bthkw\" (UniqueName: \"kubernetes.io/projected/79c3b8e5-9000-49cd-a62c-ae366a7592b0-kube-api-access-bthkw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b9cj\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.650723 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b9cj\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.658076 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b9cj\" (UID: 
\"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.658195 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b9cj\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.670091 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bthkw\" (UniqueName: \"kubernetes.io/projected/79c3b8e5-9000-49cd-a62c-ae366a7592b0-kube-api-access-bthkw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7b9cj\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:14 crc kubenswrapper[4907]: I1129 15:03:14.709993 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:15 crc kubenswrapper[4907]: I1129 15:03:15.317628 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj"] Nov 29 15:03:16 crc kubenswrapper[4907]: I1129 15:03:16.281065 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" event={"ID":"79c3b8e5-9000-49cd-a62c-ae366a7592b0","Type":"ContainerStarted","Data":"148c9633012e27c2685e66566a7eea6688ff4af46d3294d77a6144cbe2c3596a"} Nov 29 15:03:16 crc kubenswrapper[4907]: I1129 15:03:16.281631 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" event={"ID":"79c3b8e5-9000-49cd-a62c-ae366a7592b0","Type":"ContainerStarted","Data":"496a8746340de822aab4e6418863612531a77c85964c41c13580806b6fc4358e"} Nov 29 15:03:16 crc kubenswrapper[4907]: I1129 15:03:16.300298 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" podStartSLOduration=1.6573873479999999 podStartE2EDuration="2.300283375s" podCreationTimestamp="2025-11-29 15:03:14 +0000 UTC" firstStartedPulling="2025-11-29 15:03:15.321097115 +0000 UTC m=+2093.307934777" lastFinishedPulling="2025-11-29 15:03:15.963993142 +0000 UTC m=+2093.950830804" observedRunningTime="2025-11-29 15:03:16.296007524 +0000 UTC m=+2094.282845196" watchObservedRunningTime="2025-11-29 15:03:16.300283375 +0000 UTC m=+2094.287121027" Nov 29 15:03:28 crc kubenswrapper[4907]: I1129 15:03:28.054311 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gghpv"] Nov 29 15:03:28 crc kubenswrapper[4907]: I1129 15:03:28.063886 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gghpv"] Nov 29 15:03:28 crc kubenswrapper[4907]: I1129 
15:03:28.393060 4907 scope.go:117] "RemoveContainer" containerID="400e32d48bbb34dc2b5c4d40c4e020a9a4145508b813d144fa5a8ba986bbfdcb" Nov 29 15:03:28 crc kubenswrapper[4907]: I1129 15:03:28.489993 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:03:28 crc kubenswrapper[4907]: I1129 15:03:28.490052 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:03:28 crc kubenswrapper[4907]: I1129 15:03:28.530140 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9701e823-f11a-4ce6-85de-80f705092f11" path="/var/lib/kubelet/pods/9701e823-f11a-4ce6-85de-80f705092f11/volumes" Nov 29 15:03:28 crc kubenswrapper[4907]: I1129 15:03:28.531330 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:03:28 crc kubenswrapper[4907]: I1129 15:03:28.532045 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f223864d0e084019513cae274c0d44a852a5d1a01e7ee167f1727b3298f32cf"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:03:28 crc kubenswrapper[4907]: I1129 15:03:28.532128 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://3f223864d0e084019513cae274c0d44a852a5d1a01e7ee167f1727b3298f32cf" gracePeriod=600 Nov 29 15:03:29 crc kubenswrapper[4907]: I1129 15:03:29.478083 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="3f223864d0e084019513cae274c0d44a852a5d1a01e7ee167f1727b3298f32cf" exitCode=0 Nov 29 15:03:29 crc kubenswrapper[4907]: I1129 15:03:29.478132 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"3f223864d0e084019513cae274c0d44a852a5d1a01e7ee167f1727b3298f32cf"} Nov 29 15:03:29 crc kubenswrapper[4907]: I1129 15:03:29.478659 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"} Nov 29 15:03:29 crc kubenswrapper[4907]: I1129 15:03:29.478835 4907 scope.go:117] "RemoveContainer" containerID="166f6852d7e5ad1280d0e225383f71ed5e39e549898b37e77909182ba92a965f" Nov 29 15:03:30 crc kubenswrapper[4907]: I1129 15:03:30.028476 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4v7qp"] Nov 29 15:03:30 crc kubenswrapper[4907]: I1129 15:03:30.039150 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4v7qp"] Nov 29 15:03:30 crc kubenswrapper[4907]: I1129 15:03:30.499005 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="424d2988-9e7a-460f-b621-8268a86daaa5" path="/var/lib/kubelet/pods/424d2988-9e7a-460f-b621-8268a86daaa5/volumes" Nov 29 15:03:33 crc kubenswrapper[4907]: I1129 15:03:33.038576 4907 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/aodh-db-create-56fkz"] Nov 29 15:03:33 crc kubenswrapper[4907]: I1129 15:03:33.052046 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-56fkz"] Nov 29 15:03:33 crc kubenswrapper[4907]: I1129 15:03:33.064151 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0272-account-create-update-zc6hq"] Nov 29 15:03:33 crc kubenswrapper[4907]: I1129 15:03:33.075496 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0272-account-create-update-zc6hq"] Nov 29 15:03:34 crc kubenswrapper[4907]: I1129 15:03:34.500241 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a4137fc-d9f8-46ae-9740-cf388fcb54f1" path="/var/lib/kubelet/pods/5a4137fc-d9f8-46ae-9740-cf388fcb54f1/volumes" Nov 29 15:03:34 crc kubenswrapper[4907]: I1129 15:03:34.502476 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd168f22-6342-4cce-95b4-3793763c8b41" path="/var/lib/kubelet/pods/bd168f22-6342-4cce-95b4-3793763c8b41/volumes" Nov 29 15:03:42 crc kubenswrapper[4907]: I1129 15:03:42.926716 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7xmj5"] Nov 29 15:03:42 crc kubenswrapper[4907]: I1129 15:03:42.930145 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:42 crc kubenswrapper[4907]: I1129 15:03:42.938824 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7xmj5"] Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.041926 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-utilities\") pod \"redhat-operators-7xmj5\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.041999 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cts5s\" (UniqueName: \"kubernetes.io/projected/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-kube-api-access-cts5s\") pod \"redhat-operators-7xmj5\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.042584 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-catalog-content\") pod \"redhat-operators-7xmj5\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.144262 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-catalog-content\") pod \"redhat-operators-7xmj5\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.144335 4907 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-utilities\") pod \"redhat-operators-7xmj5\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.144385 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cts5s\" (UniqueName: \"kubernetes.io/projected/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-kube-api-access-cts5s\") pod \"redhat-operators-7xmj5\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.145129 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-catalog-content\") pod \"redhat-operators-7xmj5\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.145344 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-utilities\") pod \"redhat-operators-7xmj5\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.173300 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cts5s\" (UniqueName: \"kubernetes.io/projected/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-kube-api-access-cts5s\") pod \"redhat-operators-7xmj5\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.256905 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:43 crc kubenswrapper[4907]: I1129 15:03:43.825240 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7xmj5"] Nov 29 15:03:44 crc kubenswrapper[4907]: I1129 15:03:44.721914 4907 generic.go:334] "Generic (PLEG): container finished" podID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerID="ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba" exitCode=0 Nov 29 15:03:44 crc kubenswrapper[4907]: I1129 15:03:44.722034 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xmj5" event={"ID":"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99","Type":"ContainerDied","Data":"ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba"} Nov 29 15:03:44 crc kubenswrapper[4907]: I1129 15:03:44.722440 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xmj5" event={"ID":"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99","Type":"ContainerStarted","Data":"0ed442d5ed728e6b059a904ae7f3ebdceb55ec4abd923280e24d5c8500c40a46"} Nov 29 15:03:45 crc kubenswrapper[4907]: I1129 15:03:45.733105 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xmj5" event={"ID":"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99","Type":"ContainerStarted","Data":"0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d"} Nov 29 15:03:48 crc kubenswrapper[4907]: I1129 15:03:48.770771 4907 generic.go:334] "Generic (PLEG): container finished" podID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerID="0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d" exitCode=0 Nov 29 15:03:48 crc kubenswrapper[4907]: I1129 15:03:48.770839 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xmj5" 
event={"ID":"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99","Type":"ContainerDied","Data":"0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d"} Nov 29 15:03:49 crc kubenswrapper[4907]: I1129 15:03:49.786934 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xmj5" event={"ID":"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99","Type":"ContainerStarted","Data":"d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9"} Nov 29 15:03:49 crc kubenswrapper[4907]: I1129 15:03:49.821211 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7xmj5" podStartSLOduration=3.138106153 podStartE2EDuration="7.821186164s" podCreationTimestamp="2025-11-29 15:03:42 +0000 UTC" firstStartedPulling="2025-11-29 15:03:44.724315963 +0000 UTC m=+2122.711153615" lastFinishedPulling="2025-11-29 15:03:49.407395964 +0000 UTC m=+2127.394233626" observedRunningTime="2025-11-29 15:03:49.810641166 +0000 UTC m=+2127.797478818" watchObservedRunningTime="2025-11-29 15:03:49.821186164 +0000 UTC m=+2127.808023846" Nov 29 15:03:53 crc kubenswrapper[4907]: I1129 15:03:53.257833 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:53 crc kubenswrapper[4907]: I1129 15:03:53.258364 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:03:54 crc kubenswrapper[4907]: I1129 15:03:54.333036 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7xmj5" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerName="registry-server" probeResult="failure" output=< Nov 29 15:03:54 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 15:03:54 crc kubenswrapper[4907]: > Nov 29 15:03:57 crc kubenswrapper[4907]: I1129 15:03:57.879081 4907 generic.go:334] "Generic (PLEG): 
container finished" podID="79c3b8e5-9000-49cd-a62c-ae366a7592b0" containerID="148c9633012e27c2685e66566a7eea6688ff4af46d3294d77a6144cbe2c3596a" exitCode=0 Nov 29 15:03:57 crc kubenswrapper[4907]: I1129 15:03:57.879149 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" event={"ID":"79c3b8e5-9000-49cd-a62c-ae366a7592b0","Type":"ContainerDied","Data":"148c9633012e27c2685e66566a7eea6688ff4af46d3294d77a6144cbe2c3596a"} Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.381668 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.410976 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-ssh-key\") pod \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.411032 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bthkw\" (UniqueName: \"kubernetes.io/projected/79c3b8e5-9000-49cd-a62c-ae366a7592b0-kube-api-access-bthkw\") pod \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.411362 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-inventory\") pod \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\" (UID: \"79c3b8e5-9000-49cd-a62c-ae366a7592b0\") " Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.445151 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c3b8e5-9000-49cd-a62c-ae366a7592b0-kube-api-access-bthkw" (OuterVolumeSpecName: 
"kube-api-access-bthkw") pod "79c3b8e5-9000-49cd-a62c-ae366a7592b0" (UID: "79c3b8e5-9000-49cd-a62c-ae366a7592b0"). InnerVolumeSpecName "kube-api-access-bthkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.476613 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-inventory" (OuterVolumeSpecName: "inventory") pod "79c3b8e5-9000-49cd-a62c-ae366a7592b0" (UID: "79c3b8e5-9000-49cd-a62c-ae366a7592b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.481921 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "79c3b8e5-9000-49cd-a62c-ae366a7592b0" (UID: "79c3b8e5-9000-49cd-a62c-ae366a7592b0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.514950 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.514986 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79c3b8e5-9000-49cd-a62c-ae366a7592b0-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.514999 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bthkw\" (UniqueName: \"kubernetes.io/projected/79c3b8e5-9000-49cd-a62c-ae366a7592b0-kube-api-access-bthkw\") on node \"crc\" DevicePath \"\"" Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.903027 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" event={"ID":"79c3b8e5-9000-49cd-a62c-ae366a7592b0","Type":"ContainerDied","Data":"496a8746340de822aab4e6418863612531a77c85964c41c13580806b6fc4358e"} Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.903070 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="496a8746340de822aab4e6418863612531a77c85964c41c13580806b6fc4358e" Nov 29 15:03:59 crc kubenswrapper[4907]: I1129 15:03:59.903105 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7b9cj" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.061859 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt"] Nov 29 15:04:00 crc kubenswrapper[4907]: E1129 15:04:00.062353 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c3b8e5-9000-49cd-a62c-ae366a7592b0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.062372 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c3b8e5-9000-49cd-a62c-ae366a7592b0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.062624 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c3b8e5-9000-49cd-a62c-ae366a7592b0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.063386 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.068061 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.068258 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.078819 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt"] Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.079474 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.079518 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.129412 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6fv\" (UniqueName: \"kubernetes.io/projected/711bdb9b-2232-4b77-83b1-8501049d68cc-kube-api-access-mj6fv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.129646 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.129685 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.231468 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.231695 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.231792 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6fv\" (UniqueName: \"kubernetes.io/projected/711bdb9b-2232-4b77-83b1-8501049d68cc-kube-api-access-mj6fv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.236462 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt\" (UID: 
\"711bdb9b-2232-4b77-83b1-8501049d68cc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.236577 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.248750 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6fv\" (UniqueName: \"kubernetes.io/projected/711bdb9b-2232-4b77-83b1-8501049d68cc-kube-api-access-mj6fv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:00 crc kubenswrapper[4907]: I1129 15:04:00.397228 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:04:01 crc kubenswrapper[4907]: I1129 15:04:01.010513 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt"] Nov 29 15:04:01 crc kubenswrapper[4907]: W1129 15:04:01.012471 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod711bdb9b_2232_4b77_83b1_8501049d68cc.slice/crio-ec7c8061a801e46e63c17cb5ef02442a35d39bac3a645c482c1b7331cb7e2336 WatchSource:0}: Error finding container ec7c8061a801e46e63c17cb5ef02442a35d39bac3a645c482c1b7331cb7e2336: Status 404 returned error can't find the container with id ec7c8061a801e46e63c17cb5ef02442a35d39bac3a645c482c1b7331cb7e2336 Nov 29 15:04:01 crc kubenswrapper[4907]: I1129 15:04:01.950221 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" event={"ID":"711bdb9b-2232-4b77-83b1-8501049d68cc","Type":"ContainerStarted","Data":"6585b872069bfdd6da0c855bfdde9a16186295a47af654a3c879bc27f0d74129"} Nov 29 15:04:01 crc kubenswrapper[4907]: I1129 15:04:01.950754 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" event={"ID":"711bdb9b-2232-4b77-83b1-8501049d68cc","Type":"ContainerStarted","Data":"ec7c8061a801e46e63c17cb5ef02442a35d39bac3a645c482c1b7331cb7e2336"} Nov 29 15:04:01 crc kubenswrapper[4907]: I1129 15:04:01.970222 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" podStartSLOduration=1.537812115 podStartE2EDuration="1.97020254s" podCreationTimestamp="2025-11-29 15:04:00 +0000 UTC" firstStartedPulling="2025-11-29 15:04:01.017717754 +0000 UTC m=+2139.004555406" lastFinishedPulling="2025-11-29 15:04:01.450108169 +0000 UTC m=+2139.436945831" 
observedRunningTime="2025-11-29 15:04:01.965311892 +0000 UTC m=+2139.952149574" watchObservedRunningTime="2025-11-29 15:04:01.97020254 +0000 UTC m=+2139.957040192" Nov 29 15:04:04 crc kubenswrapper[4907]: I1129 15:04:04.326247 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7xmj5" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerName="registry-server" probeResult="failure" output=< Nov 29 15:04:04 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 15:04:04 crc kubenswrapper[4907]: > Nov 29 15:04:13 crc kubenswrapper[4907]: I1129 15:04:13.072490 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhqrp"] Nov 29 15:04:13 crc kubenswrapper[4907]: I1129 15:04:13.088020 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhqrp"] Nov 29 15:04:13 crc kubenswrapper[4907]: I1129 15:04:13.323206 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:04:13 crc kubenswrapper[4907]: I1129 15:04:13.405106 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:04:14 crc kubenswrapper[4907]: I1129 15:04:14.118624 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7xmj5"] Nov 29 15:04:14 crc kubenswrapper[4907]: I1129 15:04:14.508240 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6" path="/var/lib/kubelet/pods/5d92c9d8-fedb-4ab8-ac88-7e6a3b0738c6/volumes" Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.117908 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7xmj5" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerName="registry-server" 
containerID="cri-o://d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9" gracePeriod=2 Nov 29 15:04:15 crc kubenswrapper[4907]: E1129 15:04:15.271597 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54c4dfc_19d7_41a4_bf70_56cb2a0d4a99.slice/crio-conmon-d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9.scope\": RecentStats: unable to find data in memory cache]" Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.797802 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.881398 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cts5s\" (UniqueName: \"kubernetes.io/projected/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-kube-api-access-cts5s\") pod \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.881504 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-catalog-content\") pod \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.881644 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-utilities\") pod \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\" (UID: \"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99\") " Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.882582 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-utilities" 
(OuterVolumeSpecName: "utilities") pod "a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" (UID: "a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.908773 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-kube-api-access-cts5s" (OuterVolumeSpecName: "kube-api-access-cts5s") pod "a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" (UID: "a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99"). InnerVolumeSpecName "kube-api-access-cts5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.983831 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.984037 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cts5s\" (UniqueName: \"kubernetes.io/projected/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-kube-api-access-cts5s\") on node \"crc\" DevicePath \"\"" Nov 29 15:04:15 crc kubenswrapper[4907]: I1129 15:04:15.987533 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" (UID: "a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.086374 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.129140 4907 generic.go:334] "Generic (PLEG): container finished" podID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerID="d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9" exitCode=0 Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.129180 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xmj5" event={"ID":"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99","Type":"ContainerDied","Data":"d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9"} Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.129210 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xmj5" event={"ID":"a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99","Type":"ContainerDied","Data":"0ed442d5ed728e6b059a904ae7f3ebdceb55ec4abd923280e24d5c8500c40a46"} Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.129226 4907 scope.go:117] "RemoveContainer" containerID="d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.129552 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xmj5" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.160637 4907 scope.go:117] "RemoveContainer" containerID="0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.178093 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7xmj5"] Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.204488 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7xmj5"] Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.222366 4907 scope.go:117] "RemoveContainer" containerID="ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.293524 4907 scope.go:117] "RemoveContainer" containerID="d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9" Nov 29 15:04:16 crc kubenswrapper[4907]: E1129 15:04:16.295378 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9\": container with ID starting with d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9 not found: ID does not exist" containerID="d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.295426 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9"} err="failed to get container status \"d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9\": rpc error: code = NotFound desc = could not find container \"d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9\": container with ID starting with d8329f4da84d68e91cbcf01ac60c30192b917398f408676a5cb50b36b1e6e3b9 not found: ID does 
not exist" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.295482 4907 scope.go:117] "RemoveContainer" containerID="0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d" Nov 29 15:04:16 crc kubenswrapper[4907]: E1129 15:04:16.295868 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d\": container with ID starting with 0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d not found: ID does not exist" containerID="0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.295896 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d"} err="failed to get container status \"0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d\": rpc error: code = NotFound desc = could not find container \"0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d\": container with ID starting with 0a8e960c87f6d83cfb2b5f5652a441d9a7ef0666e9ef734f86059f0e948ef79d not found: ID does not exist" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.295913 4907 scope.go:117] "RemoveContainer" containerID="ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba" Nov 29 15:04:16 crc kubenswrapper[4907]: E1129 15:04:16.296351 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba\": container with ID starting with ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba not found: ID does not exist" containerID="ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.296392 4907 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba"} err="failed to get container status \"ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba\": rpc error: code = NotFound desc = could not find container \"ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba\": container with ID starting with ad06d18e49b84f79ce4ebe60f48c350874c48b753309126275fd307f065708ba not found: ID does not exist" Nov 29 15:04:16 crc kubenswrapper[4907]: I1129 15:04:16.495560 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" path="/var/lib/kubelet/pods/a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99/volumes" Nov 29 15:04:28 crc kubenswrapper[4907]: I1129 15:04:28.498500 4907 scope.go:117] "RemoveContainer" containerID="27ea6a2ac4f48b7ceb965a7dea9a7a669e9fb4b5a46bd4ed3b5bf057a4414d8c" Nov 29 15:04:28 crc kubenswrapper[4907]: I1129 15:04:28.523721 4907 scope.go:117] "RemoveContainer" containerID="5bccaa7969a6878d47cce9db04567f931f270aa24332b73632764fd38abdaf16" Nov 29 15:04:28 crc kubenswrapper[4907]: I1129 15:04:28.603135 4907 scope.go:117] "RemoveContainer" containerID="92b054cfe0a423f39cbc1c63cedaf00c15b521d76ff5f25c7b86233d09047ffb" Nov 29 15:04:28 crc kubenswrapper[4907]: I1129 15:04:28.661637 4907 scope.go:117] "RemoveContainer" containerID="f5fe780a64700774a76b6303d085ae8ab2b8872fcd14c24eb01a002df6dcf958" Nov 29 15:04:28 crc kubenswrapper[4907]: I1129 15:04:28.708858 4907 scope.go:117] "RemoveContainer" containerID="e803d9379264c0df1c46bd27b3a7bafe84c3fc57c02f60339de6c2958b84bf58" Nov 29 15:05:00 crc kubenswrapper[4907]: I1129 15:05:00.767432 4907 generic.go:334] "Generic (PLEG): container finished" podID="711bdb9b-2232-4b77-83b1-8501049d68cc" containerID="6585b872069bfdd6da0c855bfdde9a16186295a47af654a3c879bc27f0d74129" exitCode=0 Nov 29 15:05:00 crc kubenswrapper[4907]: I1129 15:05:00.767603 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" event={"ID":"711bdb9b-2232-4b77-83b1-8501049d68cc","Type":"ContainerDied","Data":"6585b872069bfdd6da0c855bfdde9a16186295a47af654a3c879bc27f0d74129"} Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.379712 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.534642 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-ssh-key\") pod \"711bdb9b-2232-4b77-83b1-8501049d68cc\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.534695 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-inventory\") pod \"711bdb9b-2232-4b77-83b1-8501049d68cc\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.534724 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj6fv\" (UniqueName: \"kubernetes.io/projected/711bdb9b-2232-4b77-83b1-8501049d68cc-kube-api-access-mj6fv\") pod \"711bdb9b-2232-4b77-83b1-8501049d68cc\" (UID: \"711bdb9b-2232-4b77-83b1-8501049d68cc\") " Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.554950 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711bdb9b-2232-4b77-83b1-8501049d68cc-kube-api-access-mj6fv" (OuterVolumeSpecName: "kube-api-access-mj6fv") pod "711bdb9b-2232-4b77-83b1-8501049d68cc" (UID: "711bdb9b-2232-4b77-83b1-8501049d68cc"). InnerVolumeSpecName "kube-api-access-mj6fv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.580332 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "711bdb9b-2232-4b77-83b1-8501049d68cc" (UID: "711bdb9b-2232-4b77-83b1-8501049d68cc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.585471 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-inventory" (OuterVolumeSpecName: "inventory") pod "711bdb9b-2232-4b77-83b1-8501049d68cc" (UID: "711bdb9b-2232-4b77-83b1-8501049d68cc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.637621 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.637655 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/711bdb9b-2232-4b77-83b1-8501049d68cc-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.637665 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj6fv\" (UniqueName: \"kubernetes.io/projected/711bdb9b-2232-4b77-83b1-8501049d68cc-kube-api-access-mj6fv\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.797665 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" 
event={"ID":"711bdb9b-2232-4b77-83b1-8501049d68cc","Type":"ContainerDied","Data":"ec7c8061a801e46e63c17cb5ef02442a35d39bac3a645c482c1b7331cb7e2336"} Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.797727 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec7c8061a801e46e63c17cb5ef02442a35d39bac3a645c482c1b7331cb7e2336" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.797737 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.903774 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5tzc9"] Nov 29 15:05:02 crc kubenswrapper[4907]: E1129 15:05:02.904377 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerName="extract-utilities" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.904397 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerName="extract-utilities" Nov 29 15:05:02 crc kubenswrapper[4907]: E1129 15:05:02.904416 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711bdb9b-2232-4b77-83b1-8501049d68cc" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.904426 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="711bdb9b-2232-4b77-83b1-8501049d68cc" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:05:02 crc kubenswrapper[4907]: E1129 15:05:02.904456 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerName="registry-server" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.904465 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" 
containerName="registry-server" Nov 29 15:05:02 crc kubenswrapper[4907]: E1129 15:05:02.904478 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerName="extract-content" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.904484 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerName="extract-content" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.904722 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54c4dfc-19d7-41a4-bf70-56cb2a0d4a99" containerName="registry-server" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.904758 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="711bdb9b-2232-4b77-83b1-8501049d68cc" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.905550 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.908390 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.908428 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.908457 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.908546 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:05:02 crc kubenswrapper[4907]: I1129 15:05:02.925264 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5tzc9"] Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 
15:05:03.048274 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5tzc9\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.048566 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7p6p\" (UniqueName: \"kubernetes.io/projected/d3aef26e-6dd9-447d-b445-09b8c9b80935-kube-api-access-v7p6p\") pod \"ssh-known-hosts-edpm-deployment-5tzc9\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.048752 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5tzc9\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.151201 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7p6p\" (UniqueName: \"kubernetes.io/projected/d3aef26e-6dd9-447d-b445-09b8c9b80935-kube-api-access-v7p6p\") pod \"ssh-known-hosts-edpm-deployment-5tzc9\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.151423 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-ssh-key-openstack-edpm-ipam\") pod 
\"ssh-known-hosts-edpm-deployment-5tzc9\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.151665 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5tzc9\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.156081 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5tzc9\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.158247 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5tzc9\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.175744 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7p6p\" (UniqueName: \"kubernetes.io/projected/d3aef26e-6dd9-447d-b445-09b8c9b80935-kube-api-access-v7p6p\") pod \"ssh-known-hosts-edpm-deployment-5tzc9\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.234672 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:03 crc kubenswrapper[4907]: I1129 15:05:03.894745 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5tzc9"] Nov 29 15:05:04 crc kubenswrapper[4907]: I1129 15:05:04.826399 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" event={"ID":"d3aef26e-6dd9-447d-b445-09b8c9b80935","Type":"ContainerStarted","Data":"15f349ece4b6a1897265c8e68df52ccf5aa263109ec573db677cbf59b107f1a6"} Nov 29 15:05:04 crc kubenswrapper[4907]: I1129 15:05:04.826790 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" event={"ID":"d3aef26e-6dd9-447d-b445-09b8c9b80935","Type":"ContainerStarted","Data":"44ea8a0415251586f2a5c8b5330806e0804475ee1d4b0e018b6c8c7891942a31"} Nov 29 15:05:04 crc kubenswrapper[4907]: I1129 15:05:04.876476 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" podStartSLOduration=2.367869734 podStartE2EDuration="2.870991866s" podCreationTimestamp="2025-11-29 15:05:02 +0000 UTC" firstStartedPulling="2025-11-29 15:05:03.898555786 +0000 UTC m=+2201.885393428" lastFinishedPulling="2025-11-29 15:05:04.401677878 +0000 UTC m=+2202.388515560" observedRunningTime="2025-11-29 15:05:04.868979589 +0000 UTC m=+2202.855817251" watchObservedRunningTime="2025-11-29 15:05:04.870991866 +0000 UTC m=+2202.857829528" Nov 29 15:05:12 crc kubenswrapper[4907]: I1129 15:05:12.948250 4907 generic.go:334] "Generic (PLEG): container finished" podID="d3aef26e-6dd9-447d-b445-09b8c9b80935" containerID="15f349ece4b6a1897265c8e68df52ccf5aa263109ec573db677cbf59b107f1a6" exitCode=0 Nov 29 15:05:12 crc kubenswrapper[4907]: I1129 15:05:12.948408 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" 
event={"ID":"d3aef26e-6dd9-447d-b445-09b8c9b80935","Type":"ContainerDied","Data":"15f349ece4b6a1897265c8e68df52ccf5aa263109ec573db677cbf59b107f1a6"} Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.576340 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.670247 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-ssh-key-openstack-edpm-ipam\") pod \"d3aef26e-6dd9-447d-b445-09b8c9b80935\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.670392 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-inventory-0\") pod \"d3aef26e-6dd9-447d-b445-09b8c9b80935\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.670527 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7p6p\" (UniqueName: \"kubernetes.io/projected/d3aef26e-6dd9-447d-b445-09b8c9b80935-kube-api-access-v7p6p\") pod \"d3aef26e-6dd9-447d-b445-09b8c9b80935\" (UID: \"d3aef26e-6dd9-447d-b445-09b8c9b80935\") " Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.681678 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3aef26e-6dd9-447d-b445-09b8c9b80935-kube-api-access-v7p6p" (OuterVolumeSpecName: "kube-api-access-v7p6p") pod "d3aef26e-6dd9-447d-b445-09b8c9b80935" (UID: "d3aef26e-6dd9-447d-b445-09b8c9b80935"). InnerVolumeSpecName "kube-api-access-v7p6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.722972 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d3aef26e-6dd9-447d-b445-09b8c9b80935" (UID: "d3aef26e-6dd9-447d-b445-09b8c9b80935"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.743926 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d3aef26e-6dd9-447d-b445-09b8c9b80935" (UID: "d3aef26e-6dd9-447d-b445-09b8c9b80935"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.771984 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.772016 4907 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d3aef26e-6dd9-447d-b445-09b8c9b80935-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.772026 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7p6p\" (UniqueName: \"kubernetes.io/projected/d3aef26e-6dd9-447d-b445-09b8c9b80935-kube-api-access-v7p6p\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.979294 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" 
event={"ID":"d3aef26e-6dd9-447d-b445-09b8c9b80935","Type":"ContainerDied","Data":"44ea8a0415251586f2a5c8b5330806e0804475ee1d4b0e018b6c8c7891942a31"} Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.979684 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44ea8a0415251586f2a5c8b5330806e0804475ee1d4b0e018b6c8c7891942a31" Nov 29 15:05:14 crc kubenswrapper[4907]: I1129 15:05:14.979383 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5tzc9" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.109312 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv"] Nov 29 15:05:15 crc kubenswrapper[4907]: E1129 15:05:15.109967 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3aef26e-6dd9-447d-b445-09b8c9b80935" containerName="ssh-known-hosts-edpm-deployment" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.109994 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3aef26e-6dd9-447d-b445-09b8c9b80935" containerName="ssh-known-hosts-edpm-deployment" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.110385 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3aef26e-6dd9-447d-b445-09b8c9b80935" containerName="ssh-known-hosts-edpm-deployment" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.111615 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.115408 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.115624 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.115838 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.116310 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.124996 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv"] Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.183510 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d9sbv\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.183611 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d9sbv\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.183654 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt7q6\" (UniqueName: \"kubernetes.io/projected/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-kube-api-access-pt7q6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d9sbv\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.287007 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d9sbv\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.287098 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d9sbv\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.287130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt7q6\" (UniqueName: \"kubernetes.io/projected/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-kube-api-access-pt7q6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d9sbv\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.293004 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d9sbv\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.293287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d9sbv\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.312249 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt7q6\" (UniqueName: \"kubernetes.io/projected/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-kube-api-access-pt7q6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d9sbv\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:15 crc kubenswrapper[4907]: I1129 15:05:15.435265 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:16 crc kubenswrapper[4907]: I1129 15:05:16.139153 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv"] Nov 29 15:05:16 crc kubenswrapper[4907]: W1129 15:05:16.143469 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17f0e654_fdc3_4250_9e2e_bf7cb21e7175.slice/crio-87d7e7d2d085657ccbe51a33554ef89a4f29b0dab4c0b572568495a3e5ba9037 WatchSource:0}: Error finding container 87d7e7d2d085657ccbe51a33554ef89a4f29b0dab4c0b572568495a3e5ba9037: Status 404 returned error can't find the container with id 87d7e7d2d085657ccbe51a33554ef89a4f29b0dab4c0b572568495a3e5ba9037 Nov 29 15:05:17 crc kubenswrapper[4907]: I1129 15:05:17.012870 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" event={"ID":"17f0e654-fdc3-4250-9e2e-bf7cb21e7175","Type":"ContainerStarted","Data":"515b06a30a77a775e01e7f42519e90054b75b96fba3fd1ec711faae58472819d"} Nov 29 15:05:17 crc kubenswrapper[4907]: I1129 15:05:17.013164 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" event={"ID":"17f0e654-fdc3-4250-9e2e-bf7cb21e7175","Type":"ContainerStarted","Data":"87d7e7d2d085657ccbe51a33554ef89a4f29b0dab4c0b572568495a3e5ba9037"} Nov 29 15:05:17 crc kubenswrapper[4907]: I1129 15:05:17.038996 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" podStartSLOduration=1.500973281 podStartE2EDuration="2.038970247s" podCreationTimestamp="2025-11-29 15:05:15 +0000 UTC" firstStartedPulling="2025-11-29 15:05:16.147915145 +0000 UTC m=+2214.134752807" lastFinishedPulling="2025-11-29 15:05:16.685912111 +0000 UTC m=+2214.672749773" observedRunningTime="2025-11-29 
15:05:17.028846171 +0000 UTC m=+2215.015683833" watchObservedRunningTime="2025-11-29 15:05:17.038970247 +0000 UTC m=+2215.025807929" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.125009 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kmjqr"] Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.128789 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.153353 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-catalog-content\") pod \"community-operators-kmjqr\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.153729 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkdc\" (UniqueName: \"kubernetes.io/projected/0e75e676-b47f-4633-9011-93f0bbc72b01-kube-api-access-2gkdc\") pod \"community-operators-kmjqr\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.153836 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-utilities\") pod \"community-operators-kmjqr\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.162311 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kmjqr"] Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.255003 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gkdc\" (UniqueName: \"kubernetes.io/projected/0e75e676-b47f-4633-9011-93f0bbc72b01-kube-api-access-2gkdc\") pod \"community-operators-kmjqr\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.255092 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-utilities\") pod \"community-operators-kmjqr\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.255142 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-catalog-content\") pod \"community-operators-kmjqr\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.255701 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-catalog-content\") pod \"community-operators-kmjqr\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.256204 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-utilities\") pod \"community-operators-kmjqr\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.288761 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2gkdc\" (UniqueName: \"kubernetes.io/projected/0e75e676-b47f-4633-9011-93f0bbc72b01-kube-api-access-2gkdc\") pod \"community-operators-kmjqr\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.464152 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:24 crc kubenswrapper[4907]: I1129 15:05:24.978571 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kmjqr"] Nov 29 15:05:24 crc kubenswrapper[4907]: W1129 15:05:24.978684 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e75e676_b47f_4633_9011_93f0bbc72b01.slice/crio-2647b1fa7ba38971979282ad56e1c44a6db24b39751a33b46b361a45a38b203c WatchSource:0}: Error finding container 2647b1fa7ba38971979282ad56e1c44a6db24b39751a33b46b361a45a38b203c: Status 404 returned error can't find the container with id 2647b1fa7ba38971979282ad56e1c44a6db24b39751a33b46b361a45a38b203c Nov 29 15:05:25 crc kubenswrapper[4907]: I1129 15:05:25.128256 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kmjqr" event={"ID":"0e75e676-b47f-4633-9011-93f0bbc72b01","Type":"ContainerStarted","Data":"2647b1fa7ba38971979282ad56e1c44a6db24b39751a33b46b361a45a38b203c"} Nov 29 15:05:26 crc kubenswrapper[4907]: I1129 15:05:26.143742 4907 generic.go:334] "Generic (PLEG): container finished" podID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerID="6f85752a38cce831f2feb7ff151dc5ce32ed1f521d60ea7fc50b9badb7607a73" exitCode=0 Nov 29 15:05:26 crc kubenswrapper[4907]: I1129 15:05:26.143815 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kmjqr" 
event={"ID":"0e75e676-b47f-4633-9011-93f0bbc72b01","Type":"ContainerDied","Data":"6f85752a38cce831f2feb7ff151dc5ce32ed1f521d60ea7fc50b9badb7607a73"} Nov 29 15:05:26 crc kubenswrapper[4907]: I1129 15:05:26.151239 4907 generic.go:334] "Generic (PLEG): container finished" podID="17f0e654-fdc3-4250-9e2e-bf7cb21e7175" containerID="515b06a30a77a775e01e7f42519e90054b75b96fba3fd1ec711faae58472819d" exitCode=0 Nov 29 15:05:26 crc kubenswrapper[4907]: I1129 15:05:26.151280 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" event={"ID":"17f0e654-fdc3-4250-9e2e-bf7cb21e7175","Type":"ContainerDied","Data":"515b06a30a77a775e01e7f42519e90054b75b96fba3fd1ec711faae58472819d"} Nov 29 15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.670526 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.848064 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt7q6\" (UniqueName: \"kubernetes.io/projected/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-kube-api-access-pt7q6\") pod \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " Nov 29 15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.848188 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-inventory\") pod \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " Nov 29 15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.848281 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-ssh-key\") pod \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\" (UID: \"17f0e654-fdc3-4250-9e2e-bf7cb21e7175\") " Nov 29 
15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.854227 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-kube-api-access-pt7q6" (OuterVolumeSpecName: "kube-api-access-pt7q6") pod "17f0e654-fdc3-4250-9e2e-bf7cb21e7175" (UID: "17f0e654-fdc3-4250-9e2e-bf7cb21e7175"). InnerVolumeSpecName "kube-api-access-pt7q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.887200 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-inventory" (OuterVolumeSpecName: "inventory") pod "17f0e654-fdc3-4250-9e2e-bf7cb21e7175" (UID: "17f0e654-fdc3-4250-9e2e-bf7cb21e7175"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.894373 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17f0e654-fdc3-4250-9e2e-bf7cb21e7175" (UID: "17f0e654-fdc3-4250-9e2e-bf7cb21e7175"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.951586 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt7q6\" (UniqueName: \"kubernetes.io/projected/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-kube-api-access-pt7q6\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.951617 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:27 crc kubenswrapper[4907]: I1129 15:05:27.951627 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17f0e654-fdc3-4250-9e2e-bf7cb21e7175-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.173256 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" event={"ID":"17f0e654-fdc3-4250-9e2e-bf7cb21e7175","Type":"ContainerDied","Data":"87d7e7d2d085657ccbe51a33554ef89a4f29b0dab4c0b572568495a3e5ba9037"} Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.173314 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87d7e7d2d085657ccbe51a33554ef89a4f29b0dab4c0b572568495a3e5ba9037" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.173337 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d9sbv" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.284456 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd"] Nov 29 15:05:28 crc kubenswrapper[4907]: E1129 15:05:28.285366 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f0e654-fdc3-4250-9e2e-bf7cb21e7175" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.285706 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f0e654-fdc3-4250-9e2e-bf7cb21e7175" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.285985 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f0e654-fdc3-4250-9e2e-bf7cb21e7175" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.286790 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.290116 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.291968 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.293102 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.299334 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.315470 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd"] Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.463578 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.463637 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.463674 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfbch\" (UniqueName: \"kubernetes.io/projected/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-kube-api-access-cfbch\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.490477 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.490529 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.565673 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.565760 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc 
kubenswrapper[4907]: I1129 15:05:28.565799 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfbch\" (UniqueName: \"kubernetes.io/projected/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-kube-api-access-cfbch\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.582300 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.583019 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.591349 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfbch\" (UniqueName: \"kubernetes.io/projected/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-kube-api-access-cfbch\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:28 crc kubenswrapper[4907]: I1129 15:05:28.616559 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:30 crc kubenswrapper[4907]: I1129 15:05:30.475822 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd"] Nov 29 15:05:30 crc kubenswrapper[4907]: W1129 15:05:30.511492 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad0ddb73_0774_41ea_999b_a915a2d0f5cd.slice/crio-6699a1015d296b37dde88327c7dde22aec11f1541d99998f3b305997c016d844 WatchSource:0}: Error finding container 6699a1015d296b37dde88327c7dde22aec11f1541d99998f3b305997c016d844: Status 404 returned error can't find the container with id 6699a1015d296b37dde88327c7dde22aec11f1541d99998f3b305997c016d844 Nov 29 15:05:31 crc kubenswrapper[4907]: I1129 15:05:31.215603 4907 generic.go:334] "Generic (PLEG): container finished" podID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerID="2a55120f778052f1c244d0ce6c0f2368a6b6f51f3fb03deb8bd60c0e8bafc39b" exitCode=0 Nov 29 15:05:31 crc kubenswrapper[4907]: I1129 15:05:31.215668 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kmjqr" event={"ID":"0e75e676-b47f-4633-9011-93f0bbc72b01","Type":"ContainerDied","Data":"2a55120f778052f1c244d0ce6c0f2368a6b6f51f3fb03deb8bd60c0e8bafc39b"} Nov 29 15:05:31 crc kubenswrapper[4907]: I1129 15:05:31.219865 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" event={"ID":"ad0ddb73-0774-41ea-999b-a915a2d0f5cd","Type":"ContainerStarted","Data":"6699a1015d296b37dde88327c7dde22aec11f1541d99998f3b305997c016d844"} Nov 29 15:05:32 crc kubenswrapper[4907]: I1129 15:05:32.253569 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kmjqr" 
event={"ID":"0e75e676-b47f-4633-9011-93f0bbc72b01","Type":"ContainerStarted","Data":"a4d3f3bd14cb5f398459a14cdd1eac0eacac6c7530f585639e598d97bf144e43"} Nov 29 15:05:32 crc kubenswrapper[4907]: I1129 15:05:32.256486 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" event={"ID":"ad0ddb73-0774-41ea-999b-a915a2d0f5cd","Type":"ContainerStarted","Data":"f250731ca61d92515246976a82678b445b505e87d97e6b3517f106e892031ae6"} Nov 29 15:05:32 crc kubenswrapper[4907]: I1129 15:05:32.302860 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kmjqr" podStartSLOduration=2.805166012 podStartE2EDuration="8.302831457s" podCreationTimestamp="2025-11-29 15:05:24 +0000 UTC" firstStartedPulling="2025-11-29 15:05:26.147326292 +0000 UTC m=+2224.134163974" lastFinishedPulling="2025-11-29 15:05:31.644991757 +0000 UTC m=+2229.631829419" observedRunningTime="2025-11-29 15:05:32.284731326 +0000 UTC m=+2230.271569088" watchObservedRunningTime="2025-11-29 15:05:32.302831457 +0000 UTC m=+2230.289669139" Nov 29 15:05:32 crc kubenswrapper[4907]: I1129 15:05:32.323567 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" podStartSLOduration=3.852072152 podStartE2EDuration="4.323546351s" podCreationTimestamp="2025-11-29 15:05:28 +0000 UTC" firstStartedPulling="2025-11-29 15:05:30.51357884 +0000 UTC m=+2228.500416522" lastFinishedPulling="2025-11-29 15:05:30.985053059 +0000 UTC m=+2228.971890721" observedRunningTime="2025-11-29 15:05:32.313319633 +0000 UTC m=+2230.300157395" watchObservedRunningTime="2025-11-29 15:05:32.323546351 +0000 UTC m=+2230.310384013" Nov 29 15:05:34 crc kubenswrapper[4907]: I1129 15:05:34.464400 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:34 crc kubenswrapper[4907]: 
I1129 15:05:34.465046 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:34 crc kubenswrapper[4907]: I1129 15:05:34.537306 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:38 crc kubenswrapper[4907]: I1129 15:05:38.063649 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-vz6m5"] Nov 29 15:05:38 crc kubenswrapper[4907]: I1129 15:05:38.075483 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-vz6m5"] Nov 29 15:05:38 crc kubenswrapper[4907]: I1129 15:05:38.501615 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc96345-5c70-4c46-8ec2-8c53e2fe35aa" path="/var/lib/kubelet/pods/4cc96345-5c70-4c46-8ec2-8c53e2fe35aa/volumes" Nov 29 15:05:41 crc kubenswrapper[4907]: I1129 15:05:41.374732 4907 generic.go:334] "Generic (PLEG): container finished" podID="ad0ddb73-0774-41ea-999b-a915a2d0f5cd" containerID="f250731ca61d92515246976a82678b445b505e87d97e6b3517f106e892031ae6" exitCode=0 Nov 29 15:05:41 crc kubenswrapper[4907]: I1129 15:05:41.374819 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" event={"ID":"ad0ddb73-0774-41ea-999b-a915a2d0f5cd","Type":"ContainerDied","Data":"f250731ca61d92515246976a82678b445b505e87d97e6b3517f106e892031ae6"} Nov 29 15:05:42 crc kubenswrapper[4907]: I1129 15:05:42.902479 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.069729 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbch\" (UniqueName: \"kubernetes.io/projected/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-kube-api-access-cfbch\") pod \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.070189 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-inventory\") pod \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.070850 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-ssh-key\") pod \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\" (UID: \"ad0ddb73-0774-41ea-999b-a915a2d0f5cd\") " Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.082799 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-kube-api-access-cfbch" (OuterVolumeSpecName: "kube-api-access-cfbch") pod "ad0ddb73-0774-41ea-999b-a915a2d0f5cd" (UID: "ad0ddb73-0774-41ea-999b-a915a2d0f5cd"). InnerVolumeSpecName "kube-api-access-cfbch". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.129149 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-inventory" (OuterVolumeSpecName: "inventory") pod "ad0ddb73-0774-41ea-999b-a915a2d0f5cd" (UID: "ad0ddb73-0774-41ea-999b-a915a2d0f5cd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.135504 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ad0ddb73-0774-41ea-999b-a915a2d0f5cd" (UID: "ad0ddb73-0774-41ea-999b-a915a2d0f5cd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.174614 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.174972 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbch\" (UniqueName: \"kubernetes.io/projected/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-kube-api-access-cfbch\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.175140 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad0ddb73-0774-41ea-999b-a915a2d0f5cd-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.431260 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" event={"ID":"ad0ddb73-0774-41ea-999b-a915a2d0f5cd","Type":"ContainerDied","Data":"6699a1015d296b37dde88327c7dde22aec11f1541d99998f3b305997c016d844"} Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.431747 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6699a1015d296b37dde88327c7dde22aec11f1541d99998f3b305997c016d844" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.431891 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.513987 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn"] Nov 29 15:05:43 crc kubenswrapper[4907]: E1129 15:05:43.515216 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0ddb73-0774-41ea-999b-a915a2d0f5cd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.515265 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0ddb73-0774-41ea-999b-a915a2d0f5cd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.515944 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad0ddb73-0774-41ea-999b-a915a2d0f5cd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.518080 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.520683 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.520697 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.520956 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.521120 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.521278 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.521423 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.521593 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.521869 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.521930 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.530085 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn"] Nov 29 15:05:43 crc 
kubenswrapper[4907]: I1129 15:05:43.692370 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.692571 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.692807 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.692924 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.693006 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-kube-api-access-xd9gz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.693130 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.693259 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.693315 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.693385 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.693546 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.693789 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.693990 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.694229 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.694555 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.694788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.694994 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 
15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.797548 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.797644 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.797671 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.797715 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.797913 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.798589 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.798659 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.798697 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-kube-api-access-xd9gz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.798741 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.798797 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.798843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.798892 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.798952 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.798979 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.799023 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.799058 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.805818 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.806495 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.806770 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.807086 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.807347 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.807404 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.808272 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.808276 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.808484 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc 
kubenswrapper[4907]: I1129 15:05:43.808599 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.809532 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.813164 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.815525 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.818365 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.822808 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.827040 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-kube-api-access-xd9gz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-l64sn\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:43 crc kubenswrapper[4907]: I1129 15:05:43.842607 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:05:44 crc kubenswrapper[4907]: I1129 15:05:44.517398 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn"] Nov 29 15:05:44 crc kubenswrapper[4907]: I1129 15:05:44.547529 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:05:44 crc kubenswrapper[4907]: I1129 15:05:44.646625 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kmjqr"] Nov 29 15:05:44 crc kubenswrapper[4907]: I1129 15:05:44.709610 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m2p45"] Nov 29 15:05:44 crc kubenswrapper[4907]: I1129 15:05:44.709901 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m2p45" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerName="registry-server" containerID="cri-o://173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a" gracePeriod=2 Nov 29 15:05:45 crc kubenswrapper[4907]: E1129 15:05:45.020414 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a is running failed: container process not found" containerID="173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a" cmd=["grpc_health_probe","-addr=:50051"] Nov 29 15:05:45 crc kubenswrapper[4907]: E1129 15:05:45.021057 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a is running failed: container process not found" 
containerID="173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a" cmd=["grpc_health_probe","-addr=:50051"] Nov 29 15:05:45 crc kubenswrapper[4907]: E1129 15:05:45.021252 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a is running failed: container process not found" containerID="173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a" cmd=["grpc_health_probe","-addr=:50051"] Nov 29 15:05:45 crc kubenswrapper[4907]: E1129 15:05:45.021279 4907 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-m2p45" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerName="registry-server" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.264186 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m2p45" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.438179 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-utilities\") pod \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.438735 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9nxt\" (UniqueName: \"kubernetes.io/projected/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-kube-api-access-x9nxt\") pod \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.438989 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-catalog-content\") pod \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\" (UID: \"a0c4e613-dc17-42e5-ba34-58c07d69b3a0\") " Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.439844 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-utilities" (OuterVolumeSpecName: "utilities") pod "a0c4e613-dc17-42e5-ba34-58c07d69b3a0" (UID: "a0c4e613-dc17-42e5-ba34-58c07d69b3a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.445943 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-kube-api-access-x9nxt" (OuterVolumeSpecName: "kube-api-access-x9nxt") pod "a0c4e613-dc17-42e5-ba34-58c07d69b3a0" (UID: "a0c4e613-dc17-42e5-ba34-58c07d69b3a0"). InnerVolumeSpecName "kube-api-access-x9nxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.458376 4907 generic.go:334] "Generic (PLEG): container finished" podID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerID="173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a" exitCode=0 Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.458430 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2p45" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.458424 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p45" event={"ID":"a0c4e613-dc17-42e5-ba34-58c07d69b3a0","Type":"ContainerDied","Data":"173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a"} Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.458531 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p45" event={"ID":"a0c4e613-dc17-42e5-ba34-58c07d69b3a0","Type":"ContainerDied","Data":"21a36d66c4fc558284051ba091f42bbd44ce37d4e5fee5343b8ab95aaa61601a"} Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.458550 4907 scope.go:117] "RemoveContainer" containerID="173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.459688 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" event={"ID":"720a0e4a-f20d-401e-9c04-fd8c001281c3","Type":"ContainerStarted","Data":"b9500ae3370314f1f1b75a35acb1c8bdc2b42aa06d5871dfb1c405046a54abe9"} Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.459723 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" 
event={"ID":"720a0e4a-f20d-401e-9c04-fd8c001281c3","Type":"ContainerStarted","Data":"f81c7c371c0ce7ed0ff8a17e8cf4e351f33978ee03256927a3f0ec283880125b"} Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.476409 4907 scope.go:117] "RemoveContainer" containerID="f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.493040 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" podStartSLOduration=2.046705283 podStartE2EDuration="2.493025412s" podCreationTimestamp="2025-11-29 15:05:43 +0000 UTC" firstStartedPulling="2025-11-29 15:05:44.529845833 +0000 UTC m=+2242.516683515" lastFinishedPulling="2025-11-29 15:05:44.976166002 +0000 UTC m=+2242.963003644" observedRunningTime="2025-11-29 15:05:45.489682747 +0000 UTC m=+2243.476520399" watchObservedRunningTime="2025-11-29 15:05:45.493025412 +0000 UTC m=+2243.479863064" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.499765 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0c4e613-dc17-42e5-ba34-58c07d69b3a0" (UID: "a0c4e613-dc17-42e5-ba34-58c07d69b3a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.528602 4907 scope.go:117] "RemoveContainer" containerID="022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.541934 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.542063 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.542135 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9nxt\" (UniqueName: \"kubernetes.io/projected/a0c4e613-dc17-42e5-ba34-58c07d69b3a0-kube-api-access-x9nxt\") on node \"crc\" DevicePath \"\"" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.552610 4907 scope.go:117] "RemoveContainer" containerID="173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a" Nov 29 15:05:45 crc kubenswrapper[4907]: E1129 15:05:45.552964 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a\": container with ID starting with 173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a not found: ID does not exist" containerID="173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.553001 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a"} err="failed to get container status 
\"173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a\": rpc error: code = NotFound desc = could not find container \"173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a\": container with ID starting with 173487a668ee3677b72f69c98add4faa9f7bd338eee87e1be1c85c72070b598a not found: ID does not exist" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.553020 4907 scope.go:117] "RemoveContainer" containerID="f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af" Nov 29 15:05:45 crc kubenswrapper[4907]: E1129 15:05:45.553202 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af\": container with ID starting with f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af not found: ID does not exist" containerID="f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.553217 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af"} err="failed to get container status \"f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af\": rpc error: code = NotFound desc = could not find container \"f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af\": container with ID starting with f2b40057eba4bae848aece7cc28a0728b2c0dfea68b95a58555e36f8950ba2af not found: ID does not exist" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.553228 4907 scope.go:117] "RemoveContainer" containerID="022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef" Nov 29 15:05:45 crc kubenswrapper[4907]: E1129 15:05:45.553397 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef\": container with ID starting with 022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef not found: ID does not exist" containerID="022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.553413 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef"} err="failed to get container status \"022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef\": rpc error: code = NotFound desc = could not find container \"022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef\": container with ID starting with 022bb455f0cf005c3b8f432c116afc990671d671eff7f59e9dc4be1464bbd7ef not found: ID does not exist" Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.800138 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m2p45"] Nov 29 15:05:45 crc kubenswrapper[4907]: I1129 15:05:45.812869 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m2p45"] Nov 29 15:05:46 crc kubenswrapper[4907]: I1129 15:05:46.511012 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" path="/var/lib/kubelet/pods/a0c4e613-dc17-42e5-ba34-58c07d69b3a0/volumes" Nov 29 15:05:58 crc kubenswrapper[4907]: I1129 15:05:58.490067 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:05:58 crc kubenswrapper[4907]: I1129 15:05:58.490975 4907 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:06:18 crc kubenswrapper[4907]: I1129 15:06:18.063265 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-rqzth"] Nov 29 15:06:18 crc kubenswrapper[4907]: I1129 15:06:18.075982 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-rqzth"] Nov 29 15:06:18 crc kubenswrapper[4907]: I1129 15:06:18.495725 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59cb670-d91b-41d5-a529-305c35b1bd12" path="/var/lib/kubelet/pods/a59cb670-d91b-41d5-a529-305c35b1bd12/volumes" Nov 29 15:06:28 crc kubenswrapper[4907]: I1129 15:06:28.489762 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:06:28 crc kubenswrapper[4907]: I1129 15:06:28.490403 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:06:28 crc kubenswrapper[4907]: I1129 15:06:28.494594 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:06:28 crc kubenswrapper[4907]: I1129 15:06:28.495657 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:06:28 crc kubenswrapper[4907]: I1129 15:06:28.495721 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" gracePeriod=600 Nov 29 15:06:28 crc kubenswrapper[4907]: E1129 15:06:28.630493 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:06:28 crc kubenswrapper[4907]: I1129 15:06:28.940032 4907 scope.go:117] "RemoveContainer" containerID="a814bf3a9083fc8e446abdf6f7c2e0a65168cbffe17329514089e65fa26384a8" Nov 29 15:06:28 crc kubenswrapper[4907]: I1129 15:06:28.975769 4907 scope.go:117] "RemoveContainer" containerID="26cfe9ef18f81683b9a4dd6218379a9f055a8cd54775f69ad0055a6716b0d29b" Nov 29 15:06:29 crc kubenswrapper[4907]: I1129 15:06:29.092658 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" exitCode=0 Nov 29 15:06:29 crc kubenswrapper[4907]: I1129 15:06:29.092698 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"} Nov 29 15:06:29 crc kubenswrapper[4907]: I1129 15:06:29.092727 4907 scope.go:117] "RemoveContainer" containerID="3f223864d0e084019513cae274c0d44a852a5d1a01e7ee167f1727b3298f32cf" Nov 29 15:06:29 crc kubenswrapper[4907]: I1129 15:06:29.093990 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:06:29 crc kubenswrapper[4907]: E1129 15:06:29.094677 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:06:37 crc kubenswrapper[4907]: I1129 15:06:37.238176 4907 generic.go:334] "Generic (PLEG): container finished" podID="720a0e4a-f20d-401e-9c04-fd8c001281c3" containerID="b9500ae3370314f1f1b75a35acb1c8bdc2b42aa06d5871dfb1c405046a54abe9" exitCode=0 Nov 29 15:06:37 crc kubenswrapper[4907]: I1129 15:06:37.238511 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" event={"ID":"720a0e4a-f20d-401e-9c04-fd8c001281c3","Type":"ContainerDied","Data":"b9500ae3370314f1f1b75a35acb1c8bdc2b42aa06d5871dfb1c405046a54abe9"} Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.846146 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.958708 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959186 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ovn-combined-ca-bundle\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959262 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959355 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-bootstrap-combined-ca-bundle\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959432 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959501 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-power-monitoring-combined-ca-bundle\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959620 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-kube-api-access-xd9gz\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959673 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959723 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-inventory\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959869 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-repo-setup-combined-ca-bundle\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.959916 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-libvirt-combined-ca-bundle\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.961418 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-neutron-metadata-combined-ca-bundle\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.961515 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ssh-key\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.961611 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.961657 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-nova-combined-ca-bundle\") pod 
\"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.961731 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-combined-ca-bundle\") pod \"720a0e4a-f20d-401e-9c04-fd8c001281c3\" (UID: \"720a0e4a-f20d-401e-9c04-fd8c001281c3\") " Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.964663 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.966515 4907 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.966534 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.969459 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.969493 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.971372 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.971518 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.972221 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.972333 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.972974 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.973694 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.975869 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-kube-api-access-xd9gz" (OuterVolumeSpecName: "kube-api-access-xd9gz") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "kube-api-access-xd9gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.978825 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.979046 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:38 crc kubenswrapper[4907]: I1129 15:06:38.979673 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.000725 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.025196 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-inventory" (OuterVolumeSpecName: "inventory") pod "720a0e4a-f20d-401e-9c04-fd8c001281c3" (UID: "720a0e4a-f20d-401e-9c04-fd8c001281c3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.068576 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd9gz\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-kube-api-access-xd9gz\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.068793 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.068909 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.069165 4907 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.069245 4907 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.069322 4907 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.069397 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.069506 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.069599 4907 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.069679 4907 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.069763 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.070217 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.070328 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 
29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.070421 4907 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/720a0e4a-f20d-401e-9c04-fd8c001281c3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.070532 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/720a0e4a-f20d-401e-9c04-fd8c001281c3-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.266705 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" event={"ID":"720a0e4a-f20d-401e-9c04-fd8c001281c3","Type":"ContainerDied","Data":"f81c7c371c0ce7ed0ff8a17e8cf4e351f33978ee03256927a3f0ec283880125b"} Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.266739 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f81c7c371c0ce7ed0ff8a17e8cf4e351f33978ee03256927a3f0ec283880125b" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.266801 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-l64sn" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.384859 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf"] Nov 29 15:06:39 crc kubenswrapper[4907]: E1129 15:06:39.385419 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720a0e4a-f20d-401e-9c04-fd8c001281c3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.385449 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="720a0e4a-f20d-401e-9c04-fd8c001281c3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 29 15:06:39 crc kubenswrapper[4907]: E1129 15:06:39.385457 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerName="extract-content" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.385464 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerName="extract-content" Nov 29 15:06:39 crc kubenswrapper[4907]: E1129 15:06:39.385498 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerName="extract-utilities" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.385537 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerName="extract-utilities" Nov 29 15:06:39 crc kubenswrapper[4907]: E1129 15:06:39.385572 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerName="registry-server" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.385578 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerName="registry-server" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.385789 
4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="720a0e4a-f20d-401e-9c04-fd8c001281c3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.385820 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c4e613-dc17-42e5-ba34-58c07d69b3a0" containerName="registry-server" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.386667 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.388735 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.388908 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.389044 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.389152 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.389892 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.401717 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf"] Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.479266 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: 
\"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.479936 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.479971 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gghjx\" (UniqueName: \"kubernetes.io/projected/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-kube-api-access-gghjx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.479990 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.480779 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.582198 4907 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.582256 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.582284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.582304 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gghjx\" (UniqueName: \"kubernetes.io/projected/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-kube-api-access-gghjx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.582321 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.583346 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.589743 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.592635 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.594377 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.602332 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gghjx\" (UniqueName: \"kubernetes.io/projected/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-kube-api-access-gghjx\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-88bdf\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:39 crc kubenswrapper[4907]: I1129 15:06:39.703859 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:06:40 crc kubenswrapper[4907]: I1129 15:06:40.335971 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf"] Nov 29 15:06:40 crc kubenswrapper[4907]: W1129 15:06:40.343355 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0dbf497_7f0c_4aaf_841d_7abbe8299bd9.slice/crio-db39c9c90d276bacc445a29fac4f7d3bd8aefb00db9cb8350bc58274f91adac5 WatchSource:0}: Error finding container db39c9c90d276bacc445a29fac4f7d3bd8aefb00db9cb8350bc58274f91adac5: Status 404 returned error can't find the container with id db39c9c90d276bacc445a29fac4f7d3bd8aefb00db9cb8350bc58274f91adac5 Nov 29 15:06:41 crc kubenswrapper[4907]: I1129 15:06:41.299881 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" event={"ID":"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9","Type":"ContainerStarted","Data":"db39c9c90d276bacc445a29fac4f7d3bd8aefb00db9cb8350bc58274f91adac5"} Nov 29 15:06:42 crc kubenswrapper[4907]: I1129 15:06:42.311578 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" event={"ID":"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9","Type":"ContainerStarted","Data":"25bc700150a29305a771f0b8c2ae0a3cdcacf588cd91be0cca4081c4c00c27b9"} Nov 29 15:06:42 crc kubenswrapper[4907]: I1129 15:06:42.342764 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" podStartSLOduration=2.666513867 
podStartE2EDuration="3.342742514s" podCreationTimestamp="2025-11-29 15:06:39 +0000 UTC" firstStartedPulling="2025-11-29 15:06:40.346287379 +0000 UTC m=+2298.333125041" lastFinishedPulling="2025-11-29 15:06:41.022516016 +0000 UTC m=+2299.009353688" observedRunningTime="2025-11-29 15:06:42.328486681 +0000 UTC m=+2300.315324343" watchObservedRunningTime="2025-11-29 15:06:42.342742514 +0000 UTC m=+2300.329580176" Nov 29 15:06:44 crc kubenswrapper[4907]: I1129 15:06:44.482719 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:06:44 crc kubenswrapper[4907]: E1129 15:06:44.489742 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:06:56 crc kubenswrapper[4907]: I1129 15:06:56.479914 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:06:56 crc kubenswrapper[4907]: E1129 15:06:56.481003 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:07:09 crc kubenswrapper[4907]: I1129 15:07:09.480826 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:07:09 crc kubenswrapper[4907]: E1129 15:07:09.482519 
4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:07:24 crc kubenswrapper[4907]: I1129 15:07:24.479952 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:07:24 crc kubenswrapper[4907]: E1129 15:07:24.480679 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:07:36 crc kubenswrapper[4907]: I1129 15:07:36.481038 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:07:36 crc kubenswrapper[4907]: E1129 15:07:36.482024 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:07:47 crc kubenswrapper[4907]: I1129 15:07:47.480066 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:07:47 crc kubenswrapper[4907]: E1129 
15:07:47.481397 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:07:57 crc kubenswrapper[4907]: I1129 15:07:57.366511 4907 generic.go:334] "Generic (PLEG): container finished" podID="a0dbf497-7f0c-4aaf-841d-7abbe8299bd9" containerID="25bc700150a29305a771f0b8c2ae0a3cdcacf588cd91be0cca4081c4c00c27b9" exitCode=0 Nov 29 15:07:57 crc kubenswrapper[4907]: I1129 15:07:57.366600 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" event={"ID":"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9","Type":"ContainerDied","Data":"25bc700150a29305a771f0b8c2ae0a3cdcacf588cd91be0cca4081c4c00c27b9"} Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.879849 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.924062 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovn-combined-ca-bundle\") pod \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.924106 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gghjx\" (UniqueName: \"kubernetes.io/projected/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-kube-api-access-gghjx\") pod \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.924153 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovncontroller-config-0\") pod \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.924398 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-inventory\") pod \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.924424 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ssh-key\") pod \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\" (UID: \"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9\") " Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.930752 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-kube-api-access-gghjx" (OuterVolumeSpecName: "kube-api-access-gghjx") pod "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9" (UID: "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9"). InnerVolumeSpecName "kube-api-access-gghjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.932664 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9" (UID: "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.960268 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-inventory" (OuterVolumeSpecName: "inventory") pod "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9" (UID: "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.965723 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9" (UID: "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 15:07:58 crc kubenswrapper[4907]: I1129 15:07:58.975821 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9" (UID: "a0dbf497-7f0c-4aaf-841d-7abbe8299bd9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.027147 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.027186 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.027199 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.027213 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gghjx\" (UniqueName: \"kubernetes.io/projected/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-kube-api-access-gghjx\") on node \"crc\" DevicePath \"\"" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.027226 4907 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0dbf497-7f0c-4aaf-841d-7abbe8299bd9-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.393737 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" event={"ID":"a0dbf497-7f0c-4aaf-841d-7abbe8299bd9","Type":"ContainerDied","Data":"db39c9c90d276bacc445a29fac4f7d3bd8aefb00db9cb8350bc58274f91adac5"} Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.393792 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db39c9c90d276bacc445a29fac4f7d3bd8aefb00db9cb8350bc58274f91adac5" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.393852 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-88bdf" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.543497 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr"] Nov 29 15:07:59 crc kubenswrapper[4907]: E1129 15:07:59.546261 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0dbf497-7f0c-4aaf-841d-7abbe8299bd9" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.546299 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0dbf497-7f0c-4aaf-841d-7abbe8299bd9" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.547095 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0dbf497-7f0c-4aaf-841d-7abbe8299bd9" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.548547 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.558917 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.559204 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.559292 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.559376 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.559506 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.559801 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.565974 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr"] Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.641735 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.642139 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.642396 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.642446 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.642543 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.642592 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lcd9t\" (UniqueName: \"kubernetes.io/projected/dd16d4b2-3a9d-4a05-8564-3de313928ab8-kube-api-access-lcd9t\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: E1129 15:07:59.733584 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0dbf497_7f0c_4aaf_841d_7abbe8299bd9.slice\": RecentStats: unable to find data in memory cache]" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.744787 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.744882 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.745014 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.745124 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcd9t\" (UniqueName: \"kubernetes.io/projected/dd16d4b2-3a9d-4a05-8564-3de313928ab8-kube-api-access-lcd9t\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.745274 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.745598 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.750014 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 
15:07:59.750124 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.751305 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.757143 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.757209 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.764085 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcd9t\" (UniqueName: 
\"kubernetes.io/projected/dd16d4b2-3a9d-4a05-8564-3de313928ab8-kube-api-access-lcd9t\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:07:59 crc kubenswrapper[4907]: I1129 15:07:59.872466 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:08:00 crc kubenswrapper[4907]: I1129 15:08:00.594344 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr"] Nov 29 15:08:00 crc kubenswrapper[4907]: I1129 15:08:00.602397 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 15:08:01 crc kubenswrapper[4907]: I1129 15:08:01.418319 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" event={"ID":"dd16d4b2-3a9d-4a05-8564-3de313928ab8","Type":"ContainerStarted","Data":"2b7ff84ae6ced703db096d1d50d39f0cc0e72523d0125cd8885f2056fe77b76f"} Nov 29 15:08:01 crc kubenswrapper[4907]: I1129 15:08:01.480299 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:08:01 crc kubenswrapper[4907]: E1129 15:08:01.480854 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:08:02 crc kubenswrapper[4907]: I1129 15:08:02.450727 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" event={"ID":"dd16d4b2-3a9d-4a05-8564-3de313928ab8","Type":"ContainerStarted","Data":"f20df86f915926cab807231376733a75baa8e7b560a2eccd8398ef27c0c80ed9"} Nov 29 15:08:02 crc kubenswrapper[4907]: I1129 15:08:02.481172 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" podStartSLOduration=2.8279970370000003 podStartE2EDuration="3.481139194s" podCreationTimestamp="2025-11-29 15:07:59 +0000 UTC" firstStartedPulling="2025-11-29 15:08:00.602157865 +0000 UTC m=+2378.588995517" lastFinishedPulling="2025-11-29 15:08:01.255299992 +0000 UTC m=+2379.242137674" observedRunningTime="2025-11-29 15:08:02.472765978 +0000 UTC m=+2380.459603670" watchObservedRunningTime="2025-11-29 15:08:02.481139194 +0000 UTC m=+2380.467976886" Nov 29 15:08:13 crc kubenswrapper[4907]: I1129 15:08:13.480376 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:08:13 crc kubenswrapper[4907]: E1129 15:08:13.481499 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.623771 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d7pmq"] Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.641891 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.647669 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7pmq"] Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.803360 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jthxj\" (UniqueName: \"kubernetes.io/projected/398328dc-dc65-4290-a5a4-b45a261a40ea-kube-api-access-jthxj\") pod \"certified-operators-d7pmq\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.803924 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-utilities\") pod \"certified-operators-d7pmq\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.804587 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-catalog-content\") pod \"certified-operators-d7pmq\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.906793 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-catalog-content\") pod \"certified-operators-d7pmq\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.906879 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jthxj\" (UniqueName: \"kubernetes.io/projected/398328dc-dc65-4290-a5a4-b45a261a40ea-kube-api-access-jthxj\") pod \"certified-operators-d7pmq\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.906984 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-utilities\") pod \"certified-operators-d7pmq\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.907831 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-catalog-content\") pod \"certified-operators-d7pmq\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.907950 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-utilities\") pod \"certified-operators-d7pmq\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.933069 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jthxj\" (UniqueName: \"kubernetes.io/projected/398328dc-dc65-4290-a5a4-b45a261a40ea-kube-api-access-jthxj\") pod \"certified-operators-d7pmq\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:22 crc kubenswrapper[4907]: I1129 15:08:22.983379 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:23 crc kubenswrapper[4907]: I1129 15:08:23.530323 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7pmq"] Nov 29 15:08:23 crc kubenswrapper[4907]: I1129 15:08:23.730011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pmq" event={"ID":"398328dc-dc65-4290-a5a4-b45a261a40ea","Type":"ContainerStarted","Data":"f4a0f33e1b8f161454e97dd0421c138c78e4fa1a3b13c351f10819764cafec55"} Nov 29 15:08:24 crc kubenswrapper[4907]: I1129 15:08:24.480553 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:08:24 crc kubenswrapper[4907]: E1129 15:08:24.481037 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:08:24 crc kubenswrapper[4907]: I1129 15:08:24.748108 4907 generic.go:334] "Generic (PLEG): container finished" podID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerID="e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c" exitCode=0 Nov 29 15:08:24 crc kubenswrapper[4907]: I1129 15:08:24.748175 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pmq" event={"ID":"398328dc-dc65-4290-a5a4-b45a261a40ea","Type":"ContainerDied","Data":"e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c"} Nov 29 15:08:25 crc kubenswrapper[4907]: I1129 15:08:25.762734 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pmq" 
event={"ID":"398328dc-dc65-4290-a5a4-b45a261a40ea","Type":"ContainerStarted","Data":"4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc"} Nov 29 15:08:26 crc kubenswrapper[4907]: I1129 15:08:26.778605 4907 generic.go:334] "Generic (PLEG): container finished" podID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerID="4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc" exitCode=0 Nov 29 15:08:26 crc kubenswrapper[4907]: I1129 15:08:26.778717 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pmq" event={"ID":"398328dc-dc65-4290-a5a4-b45a261a40ea","Type":"ContainerDied","Data":"4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc"} Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.426152 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlb8"] Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.430436 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.444673 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlb8"] Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.526374 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-utilities\") pod \"redhat-marketplace-tjlb8\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.526684 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxwrj\" (UniqueName: \"kubernetes.io/projected/9a43ab2d-d969-493b-8c18-0e9c90070982-kube-api-access-jxwrj\") pod \"redhat-marketplace-tjlb8\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.526784 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-catalog-content\") pod \"redhat-marketplace-tjlb8\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.628453 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-utilities\") pod \"redhat-marketplace-tjlb8\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.628600 4907 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jxwrj\" (UniqueName: \"kubernetes.io/projected/9a43ab2d-d969-493b-8c18-0e9c90070982-kube-api-access-jxwrj\") pod \"redhat-marketplace-tjlb8\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.628662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-catalog-content\") pod \"redhat-marketplace-tjlb8\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.628954 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-utilities\") pod \"redhat-marketplace-tjlb8\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.629108 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-catalog-content\") pod \"redhat-marketplace-tjlb8\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.648217 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxwrj\" (UniqueName: \"kubernetes.io/projected/9a43ab2d-d969-493b-8c18-0e9c90070982-kube-api-access-jxwrj\") pod \"redhat-marketplace-tjlb8\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.760102 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.807600 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pmq" event={"ID":"398328dc-dc65-4290-a5a4-b45a261a40ea","Type":"ContainerStarted","Data":"82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5"} Nov 29 15:08:27 crc kubenswrapper[4907]: I1129 15:08:27.839281 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d7pmq" podStartSLOduration=3.322825066 podStartE2EDuration="5.839262329s" podCreationTimestamp="2025-11-29 15:08:22 +0000 UTC" firstStartedPulling="2025-11-29 15:08:24.752735214 +0000 UTC m=+2402.739572906" lastFinishedPulling="2025-11-29 15:08:27.269172497 +0000 UTC m=+2405.256010169" observedRunningTime="2025-11-29 15:08:27.830094521 +0000 UTC m=+2405.816932173" watchObservedRunningTime="2025-11-29 15:08:27.839262329 +0000 UTC m=+2405.826099981" Nov 29 15:08:28 crc kubenswrapper[4907]: I1129 15:08:28.341043 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlb8"] Nov 29 15:08:28 crc kubenswrapper[4907]: I1129 15:08:28.818991 4907 generic.go:334] "Generic (PLEG): container finished" podID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerID="cb7d57641a4efeac4c047e70857d3392d4c07798455df675443b2368dc3b86d8" exitCode=0 Nov 29 15:08:28 crc kubenswrapper[4907]: I1129 15:08:28.819087 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlb8" event={"ID":"9a43ab2d-d969-493b-8c18-0e9c90070982","Type":"ContainerDied","Data":"cb7d57641a4efeac4c047e70857d3392d4c07798455df675443b2368dc3b86d8"} Nov 29 15:08:28 crc kubenswrapper[4907]: I1129 15:08:28.819311 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlb8" 
event={"ID":"9a43ab2d-d969-493b-8c18-0e9c90070982","Type":"ContainerStarted","Data":"e5f5a4e490fdd956be59a6d782f6a680d873f05a049fcdb8f5d6ad2d040804ee"} Nov 29 15:08:29 crc kubenswrapper[4907]: I1129 15:08:29.833259 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlb8" event={"ID":"9a43ab2d-d969-493b-8c18-0e9c90070982","Type":"ContainerStarted","Data":"37b3fdf02793522fccfaa17edb6354653143af87034c87c5e652a6ca2183d3fb"} Nov 29 15:08:30 crc kubenswrapper[4907]: E1129 15:08:30.506119 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a43ab2d_d969_493b_8c18_0e9c90070982.slice/crio-37b3fdf02793522fccfaa17edb6354653143af87034c87c5e652a6ca2183d3fb.scope\": RecentStats: unable to find data in memory cache]" Nov 29 15:08:30 crc kubenswrapper[4907]: I1129 15:08:30.856528 4907 generic.go:334] "Generic (PLEG): container finished" podID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerID="37b3fdf02793522fccfaa17edb6354653143af87034c87c5e652a6ca2183d3fb" exitCode=0 Nov 29 15:08:30 crc kubenswrapper[4907]: I1129 15:08:30.856572 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlb8" event={"ID":"9a43ab2d-d969-493b-8c18-0e9c90070982","Type":"ContainerDied","Data":"37b3fdf02793522fccfaa17edb6354653143af87034c87c5e652a6ca2183d3fb"} Nov 29 15:08:31 crc kubenswrapper[4907]: I1129 15:08:31.875120 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlb8" event={"ID":"9a43ab2d-d969-493b-8c18-0e9c90070982","Type":"ContainerStarted","Data":"0c23e4fbd4ee5bf7b03168aa7dee123d30b9c31cfeefcd66b63d31c8b1d319da"} Nov 29 15:08:31 crc kubenswrapper[4907]: I1129 15:08:31.902207 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tjlb8" 
podStartSLOduration=2.33999806 podStartE2EDuration="4.902178185s" podCreationTimestamp="2025-11-29 15:08:27 +0000 UTC" firstStartedPulling="2025-11-29 15:08:28.820902438 +0000 UTC m=+2406.807740100" lastFinishedPulling="2025-11-29 15:08:31.383082563 +0000 UTC m=+2409.369920225" observedRunningTime="2025-11-29 15:08:31.901373312 +0000 UTC m=+2409.888211004" watchObservedRunningTime="2025-11-29 15:08:31.902178185 +0000 UTC m=+2409.889015867" Nov 29 15:08:32 crc kubenswrapper[4907]: I1129 15:08:32.984388 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:32 crc kubenswrapper[4907]: I1129 15:08:32.984475 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:33 crc kubenswrapper[4907]: I1129 15:08:33.067747 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:33 crc kubenswrapper[4907]: I1129 15:08:33.984766 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.208010 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7pmq"] Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.208813 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d7pmq" podUID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerName="registry-server" containerID="cri-o://82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5" gracePeriod=2 Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.756819 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.869862 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-utilities\") pod \"398328dc-dc65-4290-a5a4-b45a261a40ea\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.870030 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jthxj\" (UniqueName: \"kubernetes.io/projected/398328dc-dc65-4290-a5a4-b45a261a40ea-kube-api-access-jthxj\") pod \"398328dc-dc65-4290-a5a4-b45a261a40ea\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.870055 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-catalog-content\") pod \"398328dc-dc65-4290-a5a4-b45a261a40ea\" (UID: \"398328dc-dc65-4290-a5a4-b45a261a40ea\") " Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.871311 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-utilities" (OuterVolumeSpecName: "utilities") pod "398328dc-dc65-4290-a5a4-b45a261a40ea" (UID: "398328dc-dc65-4290-a5a4-b45a261a40ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.875756 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398328dc-dc65-4290-a5a4-b45a261a40ea-kube-api-access-jthxj" (OuterVolumeSpecName: "kube-api-access-jthxj") pod "398328dc-dc65-4290-a5a4-b45a261a40ea" (UID: "398328dc-dc65-4290-a5a4-b45a261a40ea"). InnerVolumeSpecName "kube-api-access-jthxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.934382 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "398328dc-dc65-4290-a5a4-b45a261a40ea" (UID: "398328dc-dc65-4290-a5a4-b45a261a40ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.950714 4907 generic.go:334] "Generic (PLEG): container finished" podID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerID="82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5" exitCode=0 Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.950814 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pmq" event={"ID":"398328dc-dc65-4290-a5a4-b45a261a40ea","Type":"ContainerDied","Data":"82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5"} Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.950873 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pmq" event={"ID":"398328dc-dc65-4290-a5a4-b45a261a40ea","Type":"ContainerDied","Data":"f4a0f33e1b8f161454e97dd0421c138c78e4fa1a3b13c351f10819764cafec55"} Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.950897 4907 scope.go:117] "RemoveContainer" containerID="82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.951161 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7pmq" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.972648 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.972679 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jthxj\" (UniqueName: \"kubernetes.io/projected/398328dc-dc65-4290-a5a4-b45a261a40ea-kube-api-access-jthxj\") on node \"crc\" DevicePath \"\"" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.972690 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398328dc-dc65-4290-a5a4-b45a261a40ea-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.986630 4907 scope.go:117] "RemoveContainer" containerID="4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc" Nov 29 15:08:36 crc kubenswrapper[4907]: I1129 15:08:36.991362 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7pmq"] Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.004899 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d7pmq"] Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.014691 4907 scope.go:117] "RemoveContainer" containerID="e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c" Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.072842 4907 scope.go:117] "RemoveContainer" containerID="82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5" Nov 29 15:08:37 crc kubenswrapper[4907]: E1129 15:08:37.073210 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5\": container with ID starting with 82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5 not found: ID does not exist" containerID="82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5" Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.073263 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5"} err="failed to get container status \"82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5\": rpc error: code = NotFound desc = could not find container \"82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5\": container with ID starting with 82409b6b898f7ebbb7cba584a959eb47103f0472b2b834cde56f3a9b50fc77a5 not found: ID does not exist" Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.073307 4907 scope.go:117] "RemoveContainer" containerID="4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc" Nov 29 15:08:37 crc kubenswrapper[4907]: E1129 15:08:37.073539 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc\": container with ID starting with 4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc not found: ID does not exist" containerID="4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc" Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.073561 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc"} err="failed to get container status \"4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc\": rpc error: code = NotFound desc = could not find container \"4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc\": container with ID 
starting with 4cb2150c9dcaede29724a8825032c007d823f2bdba5a56873f777bd8e0c055bc not found: ID does not exist" Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.073578 4907 scope.go:117] "RemoveContainer" containerID="e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c" Nov 29 15:08:37 crc kubenswrapper[4907]: E1129 15:08:37.073803 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c\": container with ID starting with e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c not found: ID does not exist" containerID="e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c" Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.073840 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c"} err="failed to get container status \"e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c\": rpc error: code = NotFound desc = could not find container \"e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c\": container with ID starting with e341557d290cc13b0fd60a6c4eaa8cc5801d2629eaaf3632815b8862b680691c not found: ID does not exist" Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.760208 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.760623 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:37 crc kubenswrapper[4907]: I1129 15:08:37.842203 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:38 crc kubenswrapper[4907]: I1129 15:08:38.033345 4907 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:38 crc kubenswrapper[4907]: I1129 15:08:38.482968 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:08:38 crc kubenswrapper[4907]: E1129 15:08:38.483328 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:08:38 crc kubenswrapper[4907]: I1129 15:08:38.494522 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398328dc-dc65-4290-a5a4-b45a261a40ea" path="/var/lib/kubelet/pods/398328dc-dc65-4290-a5a4-b45a261a40ea/volumes" Nov 29 15:08:42 crc kubenswrapper[4907]: I1129 15:08:42.599003 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlb8"] Nov 29 15:08:42 crc kubenswrapper[4907]: I1129 15:08:42.599777 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tjlb8" podUID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerName="registry-server" containerID="cri-o://0c23e4fbd4ee5bf7b03168aa7dee123d30b9c31cfeefcd66b63d31c8b1d319da" gracePeriod=2 Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.041428 4907 generic.go:334] "Generic (PLEG): container finished" podID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerID="0c23e4fbd4ee5bf7b03168aa7dee123d30b9c31cfeefcd66b63d31c8b1d319da" exitCode=0 Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.041827 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlb8" 
event={"ID":"9a43ab2d-d969-493b-8c18-0e9c90070982","Type":"ContainerDied","Data":"0c23e4fbd4ee5bf7b03168aa7dee123d30b9c31cfeefcd66b63d31c8b1d319da"} Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.211500 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.357248 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxwrj\" (UniqueName: \"kubernetes.io/projected/9a43ab2d-d969-493b-8c18-0e9c90070982-kube-api-access-jxwrj\") pod \"9a43ab2d-d969-493b-8c18-0e9c90070982\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.357395 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-catalog-content\") pod \"9a43ab2d-d969-493b-8c18-0e9c90070982\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.357668 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-utilities\") pod \"9a43ab2d-d969-493b-8c18-0e9c90070982\" (UID: \"9a43ab2d-d969-493b-8c18-0e9c90070982\") " Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.358806 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-utilities" (OuterVolumeSpecName: "utilities") pod "9a43ab2d-d969-493b-8c18-0e9c90070982" (UID: "9a43ab2d-d969-493b-8c18-0e9c90070982"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.369992 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a43ab2d-d969-493b-8c18-0e9c90070982-kube-api-access-jxwrj" (OuterVolumeSpecName: "kube-api-access-jxwrj") pod "9a43ab2d-d969-493b-8c18-0e9c90070982" (UID: "9a43ab2d-d969-493b-8c18-0e9c90070982"). InnerVolumeSpecName "kube-api-access-jxwrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.391420 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a43ab2d-d969-493b-8c18-0e9c90070982" (UID: "9a43ab2d-d969-493b-8c18-0e9c90070982"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.461278 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxwrj\" (UniqueName: \"kubernetes.io/projected/9a43ab2d-d969-493b-8c18-0e9c90070982-kube-api-access-jxwrj\") on node \"crc\" DevicePath \"\"" Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.462155 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:08:43 crc kubenswrapper[4907]: I1129 15:08:43.462190 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a43ab2d-d969-493b-8c18-0e9c90070982-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:08:44 crc kubenswrapper[4907]: I1129 15:08:44.063109 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlb8" 
event={"ID":"9a43ab2d-d969-493b-8c18-0e9c90070982","Type":"ContainerDied","Data":"e5f5a4e490fdd956be59a6d782f6a680d873f05a049fcdb8f5d6ad2d040804ee"} Nov 29 15:08:44 crc kubenswrapper[4907]: I1129 15:08:44.063807 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjlb8" Nov 29 15:08:44 crc kubenswrapper[4907]: I1129 15:08:44.064658 4907 scope.go:117] "RemoveContainer" containerID="0c23e4fbd4ee5bf7b03168aa7dee123d30b9c31cfeefcd66b63d31c8b1d319da" Nov 29 15:08:44 crc kubenswrapper[4907]: I1129 15:08:44.102253 4907 scope.go:117] "RemoveContainer" containerID="37b3fdf02793522fccfaa17edb6354653143af87034c87c5e652a6ca2183d3fb" Nov 29 15:08:44 crc kubenswrapper[4907]: I1129 15:08:44.111400 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlb8"] Nov 29 15:08:44 crc kubenswrapper[4907]: I1129 15:08:44.128108 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlb8"] Nov 29 15:08:44 crc kubenswrapper[4907]: I1129 15:08:44.137920 4907 scope.go:117] "RemoveContainer" containerID="cb7d57641a4efeac4c047e70857d3392d4c07798455df675443b2368dc3b86d8" Nov 29 15:08:44 crc kubenswrapper[4907]: I1129 15:08:44.509103 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a43ab2d-d969-493b-8c18-0e9c90070982" path="/var/lib/kubelet/pods/9a43ab2d-d969-493b-8c18-0e9c90070982/volumes" Nov 29 15:08:53 crc kubenswrapper[4907]: I1129 15:08:53.480015 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:08:53 crc kubenswrapper[4907]: E1129 15:08:53.482053 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:08:58 crc kubenswrapper[4907]: I1129 15:08:58.319292 4907 generic.go:334] "Generic (PLEG): container finished" podID="dd16d4b2-3a9d-4a05-8564-3de313928ab8" containerID="f20df86f915926cab807231376733a75baa8e7b560a2eccd8398ef27c0c80ed9" exitCode=0 Nov 29 15:08:58 crc kubenswrapper[4907]: I1129 15:08:58.319348 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" event={"ID":"dd16d4b2-3a9d-4a05-8564-3de313928ab8","Type":"ContainerDied","Data":"f20df86f915926cab807231376733a75baa8e7b560a2eccd8398ef27c0c80ed9"} Nov 29 15:08:59 crc kubenswrapper[4907]: I1129 15:08:59.934576 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.053112 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-metadata-combined-ca-bundle\") pod \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.053202 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-inventory\") pod \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") " Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.053754 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcd9t\" (UniqueName: 
\"kubernetes.io/projected/dd16d4b2-3a9d-4a05-8564-3de313928ab8-kube-api-access-lcd9t\") pod \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") "
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.053895 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") "
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.054013 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-ssh-key\") pod \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") "
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.054137 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-nova-metadata-neutron-config-0\") pod \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\" (UID: \"dd16d4b2-3a9d-4a05-8564-3de313928ab8\") "
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.061719 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dd16d4b2-3a9d-4a05-8564-3de313928ab8" (UID: "dd16d4b2-3a9d-4a05-8564-3de313928ab8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.062883 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd16d4b2-3a9d-4a05-8564-3de313928ab8-kube-api-access-lcd9t" (OuterVolumeSpecName: "kube-api-access-lcd9t") pod "dd16d4b2-3a9d-4a05-8564-3de313928ab8" (UID: "dd16d4b2-3a9d-4a05-8564-3de313928ab8"). InnerVolumeSpecName "kube-api-access-lcd9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.097326 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd16d4b2-3a9d-4a05-8564-3de313928ab8" (UID: "dd16d4b2-3a9d-4a05-8564-3de313928ab8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.111790 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-inventory" (OuterVolumeSpecName: "inventory") pod "dd16d4b2-3a9d-4a05-8564-3de313928ab8" (UID: "dd16d4b2-3a9d-4a05-8564-3de313928ab8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.121960 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "dd16d4b2-3a9d-4a05-8564-3de313928ab8" (UID: "dd16d4b2-3a9d-4a05-8564-3de313928ab8"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.123583 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "dd16d4b2-3a9d-4a05-8564-3de313928ab8" (UID: "dd16d4b2-3a9d-4a05-8564-3de313928ab8"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.157852 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.158189 4907 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.158290 4907 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.158372 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-inventory\") on node \"crc\" DevicePath \"\""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.158464 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcd9t\" (UniqueName: \"kubernetes.io/projected/dd16d4b2-3a9d-4a05-8564-3de313928ab8-kube-api-access-lcd9t\") on node \"crc\" DevicePath \"\""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.158557 4907 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd16d4b2-3a9d-4a05-8564-3de313928ab8-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.353984 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr" event={"ID":"dd16d4b2-3a9d-4a05-8564-3de313928ab8","Type":"ContainerDied","Data":"2b7ff84ae6ced703db096d1d50d39f0cc0e72523d0125cd8885f2056fe77b76f"}
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.354345 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b7ff84ae6ced703db096d1d50d39f0cc0e72523d0125cd8885f2056fe77b76f"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.354068 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.463114 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"]
Nov 29 15:09:00 crc kubenswrapper[4907]: E1129 15:09:00.463583 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerName="extract-utilities"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.463601 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerName="extract-utilities"
Nov 29 15:09:00 crc kubenswrapper[4907]: E1129 15:09:00.463626 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerName="registry-server"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.463633 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerName="registry-server"
Nov 29 15:09:00 crc kubenswrapper[4907]: E1129 15:09:00.463651 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerName="extract-content"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.463657 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerName="extract-content"
Nov 29 15:09:00 crc kubenswrapper[4907]: E1129 15:09:00.463671 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerName="registry-server"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.463676 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerName="registry-server"
Nov 29 15:09:00 crc kubenswrapper[4907]: E1129 15:09:00.463699 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd16d4b2-3a9d-4a05-8564-3de313928ab8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.463706 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd16d4b2-3a9d-4a05-8564-3de313928ab8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Nov 29 15:09:00 crc kubenswrapper[4907]: E1129 15:09:00.463718 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerName="extract-content"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.463723 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerName="extract-content"
Nov 29 15:09:00 crc kubenswrapper[4907]: E1129 15:09:00.463739 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerName="extract-utilities"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.463745 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerName="extract-utilities"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.463996 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a43ab2d-d969-493b-8c18-0e9c90070982" containerName="registry-server"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.464032 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="398328dc-dc65-4290-a5a4-b45a261a40ea" containerName="registry-server"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.464057 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd16d4b2-3a9d-4a05-8564-3de313928ab8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.465178 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.468078 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.468177 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.468558 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.470810 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.471483 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.477281 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"]
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.573476 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.573545 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.573584 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.573726 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dvjm\" (UniqueName: \"kubernetes.io/projected/49802986-4e60-418a-9c0b-5263ebef0944-kube-api-access-9dvjm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.574106 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.677288 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.677344 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.677390 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.677503 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dvjm\" (UniqueName: \"kubernetes.io/projected/49802986-4e60-418a-9c0b-5263ebef0944-kube-api-access-9dvjm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.677620 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.682030 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.685784 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.695893 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.702419 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dvjm\" (UniqueName: \"kubernetes.io/projected/49802986-4e60-418a-9c0b-5263ebef0944-kube-api-access-9dvjm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.702940 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fprns\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:00 crc kubenswrapper[4907]: I1129 15:09:00.796287 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:09:01 crc kubenswrapper[4907]: I1129 15:09:01.431322 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"]
Nov 29 15:09:02 crc kubenswrapper[4907]: I1129 15:09:02.385484 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns" event={"ID":"49802986-4e60-418a-9c0b-5263ebef0944","Type":"ContainerStarted","Data":"e9094c9296c64305529f3254dfba5766133a4288e97a0f7e8103ce49ab1c5e57"}
Nov 29 15:09:03 crc kubenswrapper[4907]: I1129 15:09:03.450210 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns" event={"ID":"49802986-4e60-418a-9c0b-5263ebef0944","Type":"ContainerStarted","Data":"049fc81e2dd3b5582d8631054c2793349442a2e5c6edc8838b3ce9ed2fdc9104"}
Nov 29 15:09:03 crc kubenswrapper[4907]: I1129 15:09:03.471250 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns" podStartSLOduration=2.832680653 podStartE2EDuration="3.471232798s" podCreationTimestamp="2025-11-29 15:09:00 +0000 UTC" firstStartedPulling="2025-11-29 15:09:01.453546443 +0000 UTC m=+2439.440384135" lastFinishedPulling="2025-11-29 15:09:02.092098598 +0000 UTC m=+2440.078936280" observedRunningTime="2025-11-29 15:09:03.467141773 +0000 UTC m=+2441.453979425" watchObservedRunningTime="2025-11-29 15:09:03.471232798 +0000 UTC m=+2441.458070450"
Nov 29 15:09:04 crc kubenswrapper[4907]: I1129 15:09:04.482381 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:09:04 crc kubenswrapper[4907]: E1129 15:09:04.483051 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:09:17 crc kubenswrapper[4907]: I1129 15:09:17.480823 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:09:17 crc kubenswrapper[4907]: E1129 15:09:17.482049 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:09:30 crc kubenswrapper[4907]: I1129 15:09:30.479736 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:09:30 crc kubenswrapper[4907]: E1129 15:09:30.480785 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:09:41 crc kubenswrapper[4907]: I1129 15:09:41.482428 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:09:41 crc kubenswrapper[4907]: E1129 15:09:41.483155 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:09:56 crc kubenswrapper[4907]: I1129 15:09:56.480597 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:09:56 crc kubenswrapper[4907]: E1129 15:09:56.481676 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:10:10 crc kubenswrapper[4907]: I1129 15:10:10.480256 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:10:10 crc kubenswrapper[4907]: E1129 15:10:10.481101 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:10:22 crc kubenswrapper[4907]: I1129 15:10:22.492333 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:10:22 crc kubenswrapper[4907]: E1129 15:10:22.493380 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:10:35 crc kubenswrapper[4907]: I1129 15:10:35.480753 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:10:35 crc kubenswrapper[4907]: E1129 15:10:35.481878 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:10:47 crc kubenswrapper[4907]: I1129 15:10:47.480502 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:10:47 crc kubenswrapper[4907]: E1129 15:10:47.481699 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:11:02 crc kubenswrapper[4907]: I1129 15:11:02.486998 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:11:02 crc kubenswrapper[4907]: E1129 15:11:02.488533 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:11:16 crc kubenswrapper[4907]: I1129 15:11:16.479616 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:11:16 crc kubenswrapper[4907]: E1129 15:11:16.481997 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:11:29 crc kubenswrapper[4907]: I1129 15:11:29.480588 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7"
Nov 29 15:11:30 crc kubenswrapper[4907]: I1129 15:11:30.517167 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"16bb77d19bc18e0759ca15061f56e4df0e7fb5ad0c1ada9c6f037f6c741d3992"}
Nov 29 15:13:36 crc kubenswrapper[4907]: I1129 15:13:36.561744 4907 generic.go:334] "Generic (PLEG): container finished" podID="49802986-4e60-418a-9c0b-5263ebef0944" containerID="049fc81e2dd3b5582d8631054c2793349442a2e5c6edc8838b3ce9ed2fdc9104" exitCode=0
Nov 29 15:13:36 crc kubenswrapper[4907]: I1129 15:13:36.561830 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns" event={"ID":"49802986-4e60-418a-9c0b-5263ebef0944","Type":"ContainerDied","Data":"049fc81e2dd3b5582d8631054c2793349442a2e5c6edc8838b3ce9ed2fdc9104"}
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.133170 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.274507 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-secret-0\") pod \"49802986-4e60-418a-9c0b-5263ebef0944\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") "
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.275470 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dvjm\" (UniqueName: \"kubernetes.io/projected/49802986-4e60-418a-9c0b-5263ebef0944-kube-api-access-9dvjm\") pod \"49802986-4e60-418a-9c0b-5263ebef0944\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") "
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.275791 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-combined-ca-bundle\") pod \"49802986-4e60-418a-9c0b-5263ebef0944\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") "
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.276002 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-inventory\") pod \"49802986-4e60-418a-9c0b-5263ebef0944\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") "
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.276279 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-ssh-key\") pod \"49802986-4e60-418a-9c0b-5263ebef0944\" (UID: \"49802986-4e60-418a-9c0b-5263ebef0944\") "
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.282586 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "49802986-4e60-418a-9c0b-5263ebef0944" (UID: "49802986-4e60-418a-9c0b-5263ebef0944"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.282965 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49802986-4e60-418a-9c0b-5263ebef0944-kube-api-access-9dvjm" (OuterVolumeSpecName: "kube-api-access-9dvjm") pod "49802986-4e60-418a-9c0b-5263ebef0944" (UID: "49802986-4e60-418a-9c0b-5263ebef0944"). InnerVolumeSpecName "kube-api-access-9dvjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.324945 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-inventory" (OuterVolumeSpecName: "inventory") pod "49802986-4e60-418a-9c0b-5263ebef0944" (UID: "49802986-4e60-418a-9c0b-5263ebef0944"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.343923 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "49802986-4e60-418a-9c0b-5263ebef0944" (UID: "49802986-4e60-418a-9c0b-5263ebef0944"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.345815 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "49802986-4e60-418a-9c0b-5263ebef0944" (UID: "49802986-4e60-418a-9c0b-5263ebef0944"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.381177 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dvjm\" (UniqueName: \"kubernetes.io/projected/49802986-4e60-418a-9c0b-5263ebef0944-kube-api-access-9dvjm\") on node \"crc\" DevicePath \"\""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.381214 4907 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.381225 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-inventory\") on node \"crc\" DevicePath \"\""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.381234 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.381246 4907 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/49802986-4e60-418a-9c0b-5263ebef0944-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.588571 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns" event={"ID":"49802986-4e60-418a-9c0b-5263ebef0944","Type":"ContainerDied","Data":"e9094c9296c64305529f3254dfba5766133a4288e97a0f7e8103ce49ab1c5e57"}
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.588619 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9094c9296c64305529f3254dfba5766133a4288e97a0f7e8103ce49ab1c5e57"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.588625 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fprns"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.709196 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5"]
Nov 29 15:13:38 crc kubenswrapper[4907]: E1129 15:13:38.710203 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49802986-4e60-418a-9c0b-5263ebef0944" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.710226 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49802986-4e60-418a-9c0b-5263ebef0944" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.710526 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49802986-4e60-418a-9c0b-5263ebef0944" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.711465 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.713391 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.720663 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.721088 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.721474 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.721668 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.721871 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.722722 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.722787 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5"]
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.794648 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.794735 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.794835 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.794860 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.794887 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5"
Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.794917 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName:
\"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.794940 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.795077 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.795107 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp8d5\" (UniqueName: \"kubernetes.io/projected/9af53af7-2ded-4b44-92c8-85cb98ea6519-kube-api-access-pp8d5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.896511 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.896965 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.897290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.897634 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.897845 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.898068 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.898262 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.898584 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.898883 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp8d5\" (UniqueName: \"kubernetes.io/projected/9af53af7-2ded-4b44-92c8-85cb98ea6519-kube-api-access-pp8d5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.902435 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc 
kubenswrapper[4907]: I1129 15:13:38.902417 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.904043 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.904161 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.907229 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.908424 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.909298 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.910716 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:38 crc kubenswrapper[4907]: I1129 15:13:38.922779 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp8d5\" (UniqueName: \"kubernetes.io/projected/9af53af7-2ded-4b44-92c8-85cb98ea6519-kube-api-access-pp8d5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8mj5\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:39 crc kubenswrapper[4907]: I1129 15:13:39.037191 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:13:39 crc kubenswrapper[4907]: I1129 15:13:39.721163 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5"] Nov 29 15:13:39 crc kubenswrapper[4907]: I1129 15:13:39.731653 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 15:13:40 crc kubenswrapper[4907]: I1129 15:13:40.627700 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" event={"ID":"9af53af7-2ded-4b44-92c8-85cb98ea6519","Type":"ContainerStarted","Data":"b510694b4554ae8ff0989fe6c260fbf2c0d6ab034f103bf021d269db9790d48b"} Nov 29 15:13:41 crc kubenswrapper[4907]: I1129 15:13:41.646392 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" event={"ID":"9af53af7-2ded-4b44-92c8-85cb98ea6519","Type":"ContainerStarted","Data":"ef05c9921a7573cd14cb543db14d3a47a595df0e47a505114ddb9d71a01546e5"} Nov 29 15:13:41 crc kubenswrapper[4907]: I1129 15:13:41.693294 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" podStartSLOduration=3.049094904 podStartE2EDuration="3.693264737s" podCreationTimestamp="2025-11-29 15:13:38 +0000 UTC" firstStartedPulling="2025-11-29 15:13:39.73007629 +0000 UTC m=+2717.716913942" lastFinishedPulling="2025-11-29 15:13:40.374246113 +0000 UTC m=+2718.361083775" observedRunningTime="2025-11-29 15:13:41.673871569 +0000 UTC m=+2719.660709251" watchObservedRunningTime="2025-11-29 15:13:41.693264737 +0000 UTC m=+2719.680102419" Nov 29 15:13:58 crc kubenswrapper[4907]: I1129 15:13:58.490153 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:13:58 crc kubenswrapper[4907]: I1129 15:13:58.490849 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.197338 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wbtr8"] Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.201117 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.213273 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wbtr8"] Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.259022 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-utilities\") pod \"redhat-operators-wbtr8\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.259130 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-catalog-content\") pod \"redhat-operators-wbtr8\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.259344 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-84dz5\" (UniqueName: \"kubernetes.io/projected/675a61c6-a4e7-44dc-8267-e742ea49c87f-kube-api-access-84dz5\") pod \"redhat-operators-wbtr8\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.360534 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-utilities\") pod \"redhat-operators-wbtr8\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.360608 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-catalog-content\") pod \"redhat-operators-wbtr8\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.360659 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84dz5\" (UniqueName: \"kubernetes.io/projected/675a61c6-a4e7-44dc-8267-e742ea49c87f-kube-api-access-84dz5\") pod \"redhat-operators-wbtr8\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.361224 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-utilities\") pod \"redhat-operators-wbtr8\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.361710 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-catalog-content\") pod \"redhat-operators-wbtr8\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.380667 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84dz5\" (UniqueName: \"kubernetes.io/projected/675a61c6-a4e7-44dc-8267-e742ea49c87f-kube-api-access-84dz5\") pod \"redhat-operators-wbtr8\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:17 crc kubenswrapper[4907]: I1129 15:14:17.532244 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:18 crc kubenswrapper[4907]: I1129 15:14:18.093164 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wbtr8"] Nov 29 15:14:18 crc kubenswrapper[4907]: I1129 15:14:18.154734 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbtr8" event={"ID":"675a61c6-a4e7-44dc-8267-e742ea49c87f","Type":"ContainerStarted","Data":"bd6782c9bac5a43338f5b2bcda8f8e3271daed7ab15406c30a11efd4873be0df"} Nov 29 15:14:19 crc kubenswrapper[4907]: I1129 15:14:19.200100 4907 generic.go:334] "Generic (PLEG): container finished" podID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerID="a90d9ceabd418205bf46be8fca12a81eb8d7529ffbaab2e2b28a964fdba93b82" exitCode=0 Nov 29 15:14:19 crc kubenswrapper[4907]: I1129 15:14:19.200516 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbtr8" event={"ID":"675a61c6-a4e7-44dc-8267-e742ea49c87f","Type":"ContainerDied","Data":"a90d9ceabd418205bf46be8fca12a81eb8d7529ffbaab2e2b28a964fdba93b82"} Nov 29 15:14:21 crc kubenswrapper[4907]: I1129 15:14:21.235755 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wbtr8" event={"ID":"675a61c6-a4e7-44dc-8267-e742ea49c87f","Type":"ContainerStarted","Data":"5ba7c5522f9b6585014e0cc99187d1726f7b69c78dbea0319912ce7d64a4cbfa"} Nov 29 15:14:24 crc kubenswrapper[4907]: I1129 15:14:24.281322 4907 generic.go:334] "Generic (PLEG): container finished" podID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerID="5ba7c5522f9b6585014e0cc99187d1726f7b69c78dbea0319912ce7d64a4cbfa" exitCode=0 Nov 29 15:14:24 crc kubenswrapper[4907]: I1129 15:14:24.281399 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbtr8" event={"ID":"675a61c6-a4e7-44dc-8267-e742ea49c87f","Type":"ContainerDied","Data":"5ba7c5522f9b6585014e0cc99187d1726f7b69c78dbea0319912ce7d64a4cbfa"} Nov 29 15:14:25 crc kubenswrapper[4907]: I1129 15:14:25.301326 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbtr8" event={"ID":"675a61c6-a4e7-44dc-8267-e742ea49c87f","Type":"ContainerStarted","Data":"a88544a9d09b9ce0025e6db225ef28e4191ce4006a3c1e0167e7aa029dd7ab64"} Nov 29 15:14:25 crc kubenswrapper[4907]: I1129 15:14:25.336532 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wbtr8" podStartSLOduration=2.81278267 podStartE2EDuration="8.336511812s" podCreationTimestamp="2025-11-29 15:14:17 +0000 UTC" firstStartedPulling="2025-11-29 15:14:19.207182276 +0000 UTC m=+2757.194019968" lastFinishedPulling="2025-11-29 15:14:24.730911458 +0000 UTC m=+2762.717749110" observedRunningTime="2025-11-29 15:14:25.326727056 +0000 UTC m=+2763.313564718" watchObservedRunningTime="2025-11-29 15:14:25.336511812 +0000 UTC m=+2763.323349464" Nov 29 15:14:27 crc kubenswrapper[4907]: I1129 15:14:27.532485 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:27 crc kubenswrapper[4907]: I1129 15:14:27.532769 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:28 crc kubenswrapper[4907]: I1129 15:14:28.489713 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:14:28 crc kubenswrapper[4907]: I1129 15:14:28.489779 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:14:28 crc kubenswrapper[4907]: I1129 15:14:28.619524 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wbtr8" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerName="registry-server" probeResult="failure" output=< Nov 29 15:14:28 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 15:14:28 crc kubenswrapper[4907]: > Nov 29 15:14:37 crc kubenswrapper[4907]: I1129 15:14:37.634752 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:37 crc kubenswrapper[4907]: I1129 15:14:37.689601 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.157871 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wbtr8"] Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.158951 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-wbtr8" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerName="registry-server" containerID="cri-o://a88544a9d09b9ce0025e6db225ef28e4191ce4006a3c1e0167e7aa029dd7ab64" gracePeriod=2 Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.559960 4907 generic.go:334] "Generic (PLEG): container finished" podID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerID="a88544a9d09b9ce0025e6db225ef28e4191ce4006a3c1e0167e7aa029dd7ab64" exitCode=0 Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.560020 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbtr8" event={"ID":"675a61c6-a4e7-44dc-8267-e742ea49c87f","Type":"ContainerDied","Data":"a88544a9d09b9ce0025e6db225ef28e4191ce4006a3c1e0167e7aa029dd7ab64"} Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.807318 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.953789 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84dz5\" (UniqueName: \"kubernetes.io/projected/675a61c6-a4e7-44dc-8267-e742ea49c87f-kube-api-access-84dz5\") pod \"675a61c6-a4e7-44dc-8267-e742ea49c87f\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.954327 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-utilities\") pod \"675a61c6-a4e7-44dc-8267-e742ea49c87f\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.954429 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-catalog-content\") pod 
\"675a61c6-a4e7-44dc-8267-e742ea49c87f\" (UID: \"675a61c6-a4e7-44dc-8267-e742ea49c87f\") " Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.955743 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-utilities" (OuterVolumeSpecName: "utilities") pod "675a61c6-a4e7-44dc-8267-e742ea49c87f" (UID: "675a61c6-a4e7-44dc-8267-e742ea49c87f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:14:41 crc kubenswrapper[4907]: I1129 15:14:41.960673 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/675a61c6-a4e7-44dc-8267-e742ea49c87f-kube-api-access-84dz5" (OuterVolumeSpecName: "kube-api-access-84dz5") pod "675a61c6-a4e7-44dc-8267-e742ea49c87f" (UID: "675a61c6-a4e7-44dc-8267-e742ea49c87f"). InnerVolumeSpecName "kube-api-access-84dz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.057889 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84dz5\" (UniqueName: \"kubernetes.io/projected/675a61c6-a4e7-44dc-8267-e742ea49c87f-kube-api-access-84dz5\") on node \"crc\" DevicePath \"\"" Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.057935 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.067492 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "675a61c6-a4e7-44dc-8267-e742ea49c87f" (UID: "675a61c6-a4e7-44dc-8267-e742ea49c87f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.161171 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675a61c6-a4e7-44dc-8267-e742ea49c87f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.581086 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbtr8" event={"ID":"675a61c6-a4e7-44dc-8267-e742ea49c87f","Type":"ContainerDied","Data":"bd6782c9bac5a43338f5b2bcda8f8e3271daed7ab15406c30a11efd4873be0df"} Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.581122 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wbtr8" Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.581171 4907 scope.go:117] "RemoveContainer" containerID="a88544a9d09b9ce0025e6db225ef28e4191ce4006a3c1e0167e7aa029dd7ab64" Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.616216 4907 scope.go:117] "RemoveContainer" containerID="5ba7c5522f9b6585014e0cc99187d1726f7b69c78dbea0319912ce7d64a4cbfa" Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.636682 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wbtr8"] Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.651933 4907 scope.go:117] "RemoveContainer" containerID="a90d9ceabd418205bf46be8fca12a81eb8d7529ffbaab2e2b28a964fdba93b82" Nov 29 15:14:42 crc kubenswrapper[4907]: I1129 15:14:42.657686 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wbtr8"] Nov 29 15:14:44 crc kubenswrapper[4907]: I1129 15:14:44.544222 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" path="/var/lib/kubelet/pods/675a61c6-a4e7-44dc-8267-e742ea49c87f/volumes" Nov 29 15:14:58 crc 
kubenswrapper[4907]: I1129 15:14:58.490542 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:14:58 crc kubenswrapper[4907]: I1129 15:14:58.491389 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:14:58 crc kubenswrapper[4907]: I1129 15:14:58.498039 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:14:58 crc kubenswrapper[4907]: I1129 15:14:58.499282 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16bb77d19bc18e0759ca15061f56e4df0e7fb5ad0c1ada9c6f037f6c741d3992"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:14:58 crc kubenswrapper[4907]: I1129 15:14:58.499391 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://16bb77d19bc18e0759ca15061f56e4df0e7fb5ad0c1ada9c6f037f6c741d3992" gracePeriod=600 Nov 29 15:14:58 crc kubenswrapper[4907]: I1129 15:14:58.838799 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" 
containerID="16bb77d19bc18e0759ca15061f56e4df0e7fb5ad0c1ada9c6f037f6c741d3992" exitCode=0 Nov 29 15:14:58 crc kubenswrapper[4907]: I1129 15:14:58.838879 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"16bb77d19bc18e0759ca15061f56e4df0e7fb5ad0c1ada9c6f037f6c741d3992"} Nov 29 15:14:58 crc kubenswrapper[4907]: I1129 15:14:58.839192 4907 scope.go:117] "RemoveContainer" containerID="86e7a8da7f8c917c047800d1f2c9a31edbd78967804163c4c5c3172e8b2f01e7" Nov 29 15:14:59 crc kubenswrapper[4907]: I1129 15:14:59.857648 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca"} Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.178896 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd"] Nov 29 15:15:00 crc kubenswrapper[4907]: E1129 15:15:00.180514 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerName="extract-utilities" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.180601 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerName="extract-utilities" Nov 29 15:15:00 crc kubenswrapper[4907]: E1129 15:15:00.180738 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerName="registry-server" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.180825 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerName="registry-server" Nov 29 15:15:00 crc kubenswrapper[4907]: E1129 15:15:00.180910 4907 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerName="extract-content" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.180933 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerName="extract-content" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.181902 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="675a61c6-a4e7-44dc-8267-e742ea49c87f" containerName="registry-server" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.192619 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.196470 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.196466 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.213794 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd"] Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.275001 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a685f890-93f4-40fc-8613-d7c3a331d927-config-volume\") pod \"collect-profiles-29407155-jclsd\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.275058 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5t9\" (UniqueName: 
\"kubernetes.io/projected/a685f890-93f4-40fc-8613-d7c3a331d927-kube-api-access-7t5t9\") pod \"collect-profiles-29407155-jclsd\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.275222 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a685f890-93f4-40fc-8613-d7c3a331d927-secret-volume\") pod \"collect-profiles-29407155-jclsd\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.377649 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a685f890-93f4-40fc-8613-d7c3a331d927-secret-volume\") pod \"collect-profiles-29407155-jclsd\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.377891 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a685f890-93f4-40fc-8613-d7c3a331d927-config-volume\") pod \"collect-profiles-29407155-jclsd\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.377924 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5t9\" (UniqueName: \"kubernetes.io/projected/a685f890-93f4-40fc-8613-d7c3a331d927-kube-api-access-7t5t9\") pod \"collect-profiles-29407155-jclsd\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc 
kubenswrapper[4907]: I1129 15:15:00.378879 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a685f890-93f4-40fc-8613-d7c3a331d927-config-volume\") pod \"collect-profiles-29407155-jclsd\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.393227 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a685f890-93f4-40fc-8613-d7c3a331d927-secret-volume\") pod \"collect-profiles-29407155-jclsd\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.399967 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t5t9\" (UniqueName: \"kubernetes.io/projected/a685f890-93f4-40fc-8613-d7c3a331d927-kube-api-access-7t5t9\") pod \"collect-profiles-29407155-jclsd\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:00 crc kubenswrapper[4907]: I1129 15:15:00.531975 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:01 crc kubenswrapper[4907]: I1129 15:15:01.083258 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd"] Nov 29 15:15:01 crc kubenswrapper[4907]: I1129 15:15:01.902594 4907 generic.go:334] "Generic (PLEG): container finished" podID="a685f890-93f4-40fc-8613-d7c3a331d927" containerID="6c401ed14e480d3d503abd95fb383f92378a30298d57d2ba717d04b1eb279dde" exitCode=0 Nov 29 15:15:01 crc kubenswrapper[4907]: I1129 15:15:01.902849 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" event={"ID":"a685f890-93f4-40fc-8613-d7c3a331d927","Type":"ContainerDied","Data":"6c401ed14e480d3d503abd95fb383f92378a30298d57d2ba717d04b1eb279dde"} Nov 29 15:15:01 crc kubenswrapper[4907]: I1129 15:15:01.902883 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" event={"ID":"a685f890-93f4-40fc-8613-d7c3a331d927","Type":"ContainerStarted","Data":"ff9ffc71a4bdf40ff7bdcf912787c5a46960a2136156a78bebbe38b30e330b41"} Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.325161 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.467477 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t5t9\" (UniqueName: \"kubernetes.io/projected/a685f890-93f4-40fc-8613-d7c3a331d927-kube-api-access-7t5t9\") pod \"a685f890-93f4-40fc-8613-d7c3a331d927\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.467616 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a685f890-93f4-40fc-8613-d7c3a331d927-config-volume\") pod \"a685f890-93f4-40fc-8613-d7c3a331d927\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.467682 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a685f890-93f4-40fc-8613-d7c3a331d927-secret-volume\") pod \"a685f890-93f4-40fc-8613-d7c3a331d927\" (UID: \"a685f890-93f4-40fc-8613-d7c3a331d927\") " Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.468587 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a685f890-93f4-40fc-8613-d7c3a331d927-config-volume" (OuterVolumeSpecName: "config-volume") pod "a685f890-93f4-40fc-8613-d7c3a331d927" (UID: "a685f890-93f4-40fc-8613-d7c3a331d927"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.474556 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a685f890-93f4-40fc-8613-d7c3a331d927-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a685f890-93f4-40fc-8613-d7c3a331d927" (UID: "a685f890-93f4-40fc-8613-d7c3a331d927"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.479070 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a685f890-93f4-40fc-8613-d7c3a331d927-kube-api-access-7t5t9" (OuterVolumeSpecName: "kube-api-access-7t5t9") pod "a685f890-93f4-40fc-8613-d7c3a331d927" (UID: "a685f890-93f4-40fc-8613-d7c3a331d927"). InnerVolumeSpecName "kube-api-access-7t5t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.570170 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t5t9\" (UniqueName: \"kubernetes.io/projected/a685f890-93f4-40fc-8613-d7c3a331d927-kube-api-access-7t5t9\") on node \"crc\" DevicePath \"\"" Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.570204 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a685f890-93f4-40fc-8613-d7c3a331d927-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.570213 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a685f890-93f4-40fc-8613-d7c3a331d927-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.936675 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" event={"ID":"a685f890-93f4-40fc-8613-d7c3a331d927","Type":"ContainerDied","Data":"ff9ffc71a4bdf40ff7bdcf912787c5a46960a2136156a78bebbe38b30e330b41"} Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.936958 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff9ffc71a4bdf40ff7bdcf912787c5a46960a2136156a78bebbe38b30e330b41" Nov 29 15:15:03 crc kubenswrapper[4907]: I1129 15:15:03.936765 4907 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd" Nov 29 15:15:04 crc kubenswrapper[4907]: I1129 15:15:04.414265 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh"] Nov 29 15:15:04 crc kubenswrapper[4907]: I1129 15:15:04.427717 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407110-d75hh"] Nov 29 15:15:04 crc kubenswrapper[4907]: I1129 15:15:04.513106 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a259660c-b57b-4a89-9f33-19d3bb3f5a93" path="/var/lib/kubelet/pods/a259660c-b57b-4a89-9f33-19d3bb3f5a93/volumes" Nov 29 15:15:29 crc kubenswrapper[4907]: I1129 15:15:29.474414 4907 scope.go:117] "RemoveContainer" containerID="602193d41e220ce3274d996b139832df814b3285c233c0710393a7ec24970b82" Nov 29 15:16:48 crc kubenswrapper[4907]: I1129 15:16:48.387737 4907 generic.go:334] "Generic (PLEG): container finished" podID="9af53af7-2ded-4b44-92c8-85cb98ea6519" containerID="ef05c9921a7573cd14cb543db14d3a47a595df0e47a505114ddb9d71a01546e5" exitCode=0 Nov 29 15:16:48 crc kubenswrapper[4907]: I1129 15:16:48.387804 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" event={"ID":"9af53af7-2ded-4b44-92c8-85cb98ea6519","Type":"ContainerDied","Data":"ef05c9921a7573cd14cb543db14d3a47a595df0e47a505114ddb9d71a01546e5"} Nov 29 15:16:49 crc kubenswrapper[4907]: I1129 15:16:49.922973 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.125334 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-1\") pod \"9af53af7-2ded-4b44-92c8-85cb98ea6519\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.125511 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-1\") pod \"9af53af7-2ded-4b44-92c8-85cb98ea6519\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.125609 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-extra-config-0\") pod \"9af53af7-2ded-4b44-92c8-85cb98ea6519\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.125673 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-0\") pod \"9af53af7-2ded-4b44-92c8-85cb98ea6519\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.125741 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp8d5\" (UniqueName: \"kubernetes.io/projected/9af53af7-2ded-4b44-92c8-85cb98ea6519-kube-api-access-pp8d5\") pod \"9af53af7-2ded-4b44-92c8-85cb98ea6519\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 
15:16:50.125792 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-0\") pod \"9af53af7-2ded-4b44-92c8-85cb98ea6519\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.125820 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-ssh-key\") pod \"9af53af7-2ded-4b44-92c8-85cb98ea6519\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.125866 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-combined-ca-bundle\") pod \"9af53af7-2ded-4b44-92c8-85cb98ea6519\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.125893 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-inventory\") pod \"9af53af7-2ded-4b44-92c8-85cb98ea6519\" (UID: \"9af53af7-2ded-4b44-92c8-85cb98ea6519\") " Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.132173 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9af53af7-2ded-4b44-92c8-85cb98ea6519" (UID: "9af53af7-2ded-4b44-92c8-85cb98ea6519"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.133076 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af53af7-2ded-4b44-92c8-85cb98ea6519-kube-api-access-pp8d5" (OuterVolumeSpecName: "kube-api-access-pp8d5") pod "9af53af7-2ded-4b44-92c8-85cb98ea6519" (UID: "9af53af7-2ded-4b44-92c8-85cb98ea6519"). InnerVolumeSpecName "kube-api-access-pp8d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.168238 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9af53af7-2ded-4b44-92c8-85cb98ea6519" (UID: "9af53af7-2ded-4b44-92c8-85cb98ea6519"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.172357 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9af53af7-2ded-4b44-92c8-85cb98ea6519" (UID: "9af53af7-2ded-4b44-92c8-85cb98ea6519"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.175243 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-inventory" (OuterVolumeSpecName: "inventory") pod "9af53af7-2ded-4b44-92c8-85cb98ea6519" (UID: "9af53af7-2ded-4b44-92c8-85cb98ea6519"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.187990 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9af53af7-2ded-4b44-92c8-85cb98ea6519" (UID: "9af53af7-2ded-4b44-92c8-85cb98ea6519"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.190044 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9af53af7-2ded-4b44-92c8-85cb98ea6519" (UID: "9af53af7-2ded-4b44-92c8-85cb98ea6519"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.191123 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9af53af7-2ded-4b44-92c8-85cb98ea6519" (UID: "9af53af7-2ded-4b44-92c8-85cb98ea6519"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.191829 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9af53af7-2ded-4b44-92c8-85cb98ea6519" (UID: "9af53af7-2ded-4b44-92c8-85cb98ea6519"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.229688 4907 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.229734 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.229750 4907 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.229763 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.229778 4907 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.229790 4907 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.229803 4907 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 
15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.229814 4907 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9af53af7-2ded-4b44-92c8-85cb98ea6519-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.229826 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp8d5\" (UniqueName: \"kubernetes.io/projected/9af53af7-2ded-4b44-92c8-85cb98ea6519-kube-api-access-pp8d5\") on node \"crc\" DevicePath \"\"" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.425248 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" event={"ID":"9af53af7-2ded-4b44-92c8-85cb98ea6519","Type":"ContainerDied","Data":"b510694b4554ae8ff0989fe6c260fbf2c0d6ab034f103bf021d269db9790d48b"} Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.425299 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8mj5" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.425319 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b510694b4554ae8ff0989fe6c260fbf2c0d6ab034f103bf021d269db9790d48b" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.541249 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4"] Nov 29 15:16:50 crc kubenswrapper[4907]: E1129 15:16:50.542157 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a685f890-93f4-40fc-8613-d7c3a331d927" containerName="collect-profiles" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.542173 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a685f890-93f4-40fc-8613-d7c3a331d927" containerName="collect-profiles" Nov 29 15:16:50 crc kubenswrapper[4907]: E1129 15:16:50.542218 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af53af7-2ded-4b44-92c8-85cb98ea6519" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.542226 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af53af7-2ded-4b44-92c8-85cb98ea6519" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.550464 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af53af7-2ded-4b44-92c8-85cb98ea6519" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.550546 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a685f890-93f4-40fc-8613-d7c3a331d927" containerName="collect-profiles" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.551564 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4"] Nov 29 15:16:50 crc kubenswrapper[4907]: 
I1129 15:16:50.551662 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.576375 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.576643 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.576975 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.577152 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.577309 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.675511 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.676196 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p27ml\" (UniqueName: \"kubernetes.io/projected/459ecb90-0260-49a5-a146-fb948f9daefb-kube-api-access-p27ml\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.676655 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.676803 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.676968 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.677022 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc 
kubenswrapper[4907]: I1129 15:16:50.677180 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.779424 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.779570 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.780791 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.780880 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.781372 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.781473 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p27ml\" (UniqueName: \"kubernetes.io/projected/459ecb90-0260-49a5-a146-fb948f9daefb-kube-api-access-p27ml\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.781701 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.786266 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.786773 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.787139 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.788166 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.789061 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.790153 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.803037 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p27ml\" (UniqueName: \"kubernetes.io/projected/459ecb90-0260-49a5-a146-fb948f9daefb-kube-api-access-p27ml\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:50 crc kubenswrapper[4907]: I1129 15:16:50.896192 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:16:51 crc kubenswrapper[4907]: I1129 15:16:51.485063 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4"] Nov 29 15:16:52 crc kubenswrapper[4907]: I1129 15:16:52.445012 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" event={"ID":"459ecb90-0260-49a5-a146-fb948f9daefb","Type":"ContainerStarted","Data":"8ffa30c1f6961ee65f1ca937e9092c43512356d2a9deae157f1603ddb578f378"} Nov 29 15:16:52 crc kubenswrapper[4907]: I1129 15:16:52.445275 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" event={"ID":"459ecb90-0260-49a5-a146-fb948f9daefb","Type":"ContainerStarted","Data":"71355addc6a31074dbc2e3666512677842d30d4c605b8e4742a446ca8bf41378"} Nov 29 15:16:52 crc kubenswrapper[4907]: I1129 15:16:52.468073 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" 
podStartSLOduration=2.013278658 podStartE2EDuration="2.468056276s" podCreationTimestamp="2025-11-29 15:16:50 +0000 UTC" firstStartedPulling="2025-11-29 15:16:51.500506434 +0000 UTC m=+2909.487344116" lastFinishedPulling="2025-11-29 15:16:51.955284052 +0000 UTC m=+2909.942121734" observedRunningTime="2025-11-29 15:16:52.461261094 +0000 UTC m=+2910.448098746" watchObservedRunningTime="2025-11-29 15:16:52.468056276 +0000 UTC m=+2910.454893918" Nov 29 15:16:58 crc kubenswrapper[4907]: I1129 15:16:58.490397 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:16:58 crc kubenswrapper[4907]: I1129 15:16:58.491026 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:17:28 crc kubenswrapper[4907]: I1129 15:17:28.489987 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:17:28 crc kubenswrapper[4907]: I1129 15:17:28.492650 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:17:58 crc kubenswrapper[4907]: I1129 
15:17:58.489953 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:17:58 crc kubenswrapper[4907]: I1129 15:17:58.493476 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:17:58 crc kubenswrapper[4907]: I1129 15:17:58.496554 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:17:58 crc kubenswrapper[4907]: I1129 15:17:58.497550 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:17:58 crc kubenswrapper[4907]: I1129 15:17:58.497642 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" gracePeriod=600 Nov 29 15:17:58 crc kubenswrapper[4907]: E1129 15:17:58.629003 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:17:59 crc kubenswrapper[4907]: I1129 15:17:59.394396 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" exitCode=0 Nov 29 15:17:59 crc kubenswrapper[4907]: I1129 15:17:59.394487 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca"} Nov 29 15:17:59 crc kubenswrapper[4907]: I1129 15:17:59.394579 4907 scope.go:117] "RemoveContainer" containerID="16bb77d19bc18e0759ca15061f56e4df0e7fb5ad0c1ada9c6f037f6c741d3992" Nov 29 15:17:59 crc kubenswrapper[4907]: I1129 15:17:59.395719 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:17:59 crc kubenswrapper[4907]: E1129 15:17:59.396348 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:18:12 crc kubenswrapper[4907]: I1129 15:18:12.490480 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:18:12 crc kubenswrapper[4907]: E1129 15:18:12.491835 4907 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:18:24 crc kubenswrapper[4907]: I1129 15:18:24.480932 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:18:24 crc kubenswrapper[4907]: E1129 15:18:24.482391 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:18:38 crc kubenswrapper[4907]: I1129 15:18:38.480705 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:18:38 crc kubenswrapper[4907]: E1129 15:18:38.481959 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:18:49 crc kubenswrapper[4907]: I1129 15:18:49.479962 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:18:49 crc kubenswrapper[4907]: E1129 15:18:49.481703 4907 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:19:03 crc kubenswrapper[4907]: I1129 15:19:03.479950 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:19:03 crc kubenswrapper[4907]: E1129 15:19:03.482203 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:19:18 crc kubenswrapper[4907]: I1129 15:19:18.479850 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:19:18 crc kubenswrapper[4907]: E1129 15:19:18.480720 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:19:22 crc kubenswrapper[4907]: I1129 15:19:22.991833 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s995r"] Nov 29 15:19:22 crc kubenswrapper[4907]: I1129 15:19:22.997668 4907 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.024182 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s995r"] Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.094209 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hkwf\" (UniqueName: \"kubernetes.io/projected/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-kube-api-access-8hkwf\") pod \"redhat-marketplace-s995r\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.094490 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-utilities\") pod \"redhat-marketplace-s995r\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.094658 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-catalog-content\") pod \"redhat-marketplace-s995r\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.197251 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hkwf\" (UniqueName: \"kubernetes.io/projected/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-kube-api-access-8hkwf\") pod \"redhat-marketplace-s995r\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.197372 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-utilities\") pod \"redhat-marketplace-s995r\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.197425 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-catalog-content\") pod \"redhat-marketplace-s995r\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.197973 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-catalog-content\") pod \"redhat-marketplace-s995r\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.198143 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-utilities\") pod \"redhat-marketplace-s995r\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.224419 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hkwf\" (UniqueName: \"kubernetes.io/projected/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-kube-api-access-8hkwf\") pod \"redhat-marketplace-s995r\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.333738 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:23 crc kubenswrapper[4907]: I1129 15:19:23.839148 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s995r"] Nov 29 15:19:24 crc kubenswrapper[4907]: I1129 15:19:24.675476 4907 generic.go:334] "Generic (PLEG): container finished" podID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerID="2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933" exitCode=0 Nov 29 15:19:24 crc kubenswrapper[4907]: I1129 15:19:24.675570 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s995r" event={"ID":"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01","Type":"ContainerDied","Data":"2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933"} Nov 29 15:19:24 crc kubenswrapper[4907]: I1129 15:19:24.675785 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s995r" event={"ID":"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01","Type":"ContainerStarted","Data":"4940f6a8c5e4f3aa6e33618eb783c0432f9065225f51a5d89a43b49272b85f47"} Nov 29 15:19:24 crc kubenswrapper[4907]: I1129 15:19:24.679694 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 15:19:26 crc kubenswrapper[4907]: I1129 15:19:26.710694 4907 generic.go:334] "Generic (PLEG): container finished" podID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerID="3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd" exitCode=0 Nov 29 15:19:26 crc kubenswrapper[4907]: I1129 15:19:26.710746 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s995r" event={"ID":"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01","Type":"ContainerDied","Data":"3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd"} Nov 29 15:19:27 crc kubenswrapper[4907]: I1129 15:19:27.722640 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-s995r" event={"ID":"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01","Type":"ContainerStarted","Data":"a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753"} Nov 29 15:19:27 crc kubenswrapper[4907]: I1129 15:19:27.747070 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s995r" podStartSLOduration=3.181583493 podStartE2EDuration="5.74705283s" podCreationTimestamp="2025-11-29 15:19:22 +0000 UTC" firstStartedPulling="2025-11-29 15:19:24.679289396 +0000 UTC m=+3062.666127078" lastFinishedPulling="2025-11-29 15:19:27.244758733 +0000 UTC m=+3065.231596415" observedRunningTime="2025-11-29 15:19:27.737785539 +0000 UTC m=+3065.724623191" watchObservedRunningTime="2025-11-29 15:19:27.74705283 +0000 UTC m=+3065.733890482" Nov 29 15:19:29 crc kubenswrapper[4907]: I1129 15:19:29.749603 4907 generic.go:334] "Generic (PLEG): container finished" podID="459ecb90-0260-49a5-a146-fb948f9daefb" containerID="8ffa30c1f6961ee65f1ca937e9092c43512356d2a9deae157f1603ddb578f378" exitCode=0 Nov 29 15:19:29 crc kubenswrapper[4907]: I1129 15:19:29.749706 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" event={"ID":"459ecb90-0260-49a5-a146-fb948f9daefb","Type":"ContainerDied","Data":"8ffa30c1f6961ee65f1ca937e9092c43512356d2a9deae157f1603ddb578f378"} Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.368575 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.442402 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p27ml\" (UniqueName: \"kubernetes.io/projected/459ecb90-0260-49a5-a146-fb948f9daefb-kube-api-access-p27ml\") pod \"459ecb90-0260-49a5-a146-fb948f9daefb\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.442486 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-1\") pod \"459ecb90-0260-49a5-a146-fb948f9daefb\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.442517 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-0\") pod \"459ecb90-0260-49a5-a146-fb948f9daefb\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.442574 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ssh-key\") pod \"459ecb90-0260-49a5-a146-fb948f9daefb\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.442659 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-inventory\") pod \"459ecb90-0260-49a5-a146-fb948f9daefb\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.442713 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-telemetry-combined-ca-bundle\") pod \"459ecb90-0260-49a5-a146-fb948f9daefb\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.442760 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-2\") pod \"459ecb90-0260-49a5-a146-fb948f9daefb\" (UID: \"459ecb90-0260-49a5-a146-fb948f9daefb\") " Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.449727 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "459ecb90-0260-49a5-a146-fb948f9daefb" (UID: "459ecb90-0260-49a5-a146-fb948f9daefb"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.450644 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459ecb90-0260-49a5-a146-fb948f9daefb-kube-api-access-p27ml" (OuterVolumeSpecName: "kube-api-access-p27ml") pod "459ecb90-0260-49a5-a146-fb948f9daefb" (UID: "459ecb90-0260-49a5-a146-fb948f9daefb"). InnerVolumeSpecName "kube-api-access-p27ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.478606 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-inventory" (OuterVolumeSpecName: "inventory") pod "459ecb90-0260-49a5-a146-fb948f9daefb" (UID: "459ecb90-0260-49a5-a146-fb948f9daefb"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.480531 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "459ecb90-0260-49a5-a146-fb948f9daefb" (UID: "459ecb90-0260-49a5-a146-fb948f9daefb"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.482168 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "459ecb90-0260-49a5-a146-fb948f9daefb" (UID: "459ecb90-0260-49a5-a146-fb948f9daefb"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.482892 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "459ecb90-0260-49a5-a146-fb948f9daefb" (UID: "459ecb90-0260-49a5-a146-fb948f9daefb"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.484613 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "459ecb90-0260-49a5-a146-fb948f9daefb" (UID: "459ecb90-0260-49a5-a146-fb948f9daefb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.547276 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.547309 4907 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.547324 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.547335 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p27ml\" (UniqueName: \"kubernetes.io/projected/459ecb90-0260-49a5-a146-fb948f9daefb-kube-api-access-p27ml\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.547346 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.547356 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.547366 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/459ecb90-0260-49a5-a146-fb948f9daefb-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.783141 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" event={"ID":"459ecb90-0260-49a5-a146-fb948f9daefb","Type":"ContainerDied","Data":"71355addc6a31074dbc2e3666512677842d30d4c605b8e4742a446ca8bf41378"} Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.783199 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71355addc6a31074dbc2e3666512677842d30d4c605b8e4742a446ca8bf41378" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.783201 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.929120 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b"] Nov 29 15:19:31 crc kubenswrapper[4907]: E1129 15:19:31.929695 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459ecb90-0260-49a5-a146-fb948f9daefb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.929726 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="459ecb90-0260-49a5-a146-fb948f9daefb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.930072 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="459ecb90-0260-49a5-a146-fb948f9daefb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.931023 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.935980 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.936515 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.936803 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.937586 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.948126 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:19:31 crc kubenswrapper[4907]: I1129 15:19:31.973093 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b"] Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.081324 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.081372 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.081431 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2npjk\" (UniqueName: \"kubernetes.io/projected/3ce1e052-0764-4391-9bb8-149e06b8744a-kube-api-access-2npjk\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.081602 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.081626 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.081655 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.081688 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.185254 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.185334 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.185601 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2npjk\" (UniqueName: 
\"kubernetes.io/projected/3ce1e052-0764-4391-9bb8-149e06b8744a-kube-api-access-2npjk\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.185770 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.185814 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.185872 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.185942 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-inventory\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.191805 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.191811 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.191970 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.193827 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.197377 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.199045 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.211984 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2npjk\" (UniqueName: \"kubernetes.io/projected/3ce1e052-0764-4391-9bb8-149e06b8744a-kube-api-access-2npjk\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.254948 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:19:32 crc kubenswrapper[4907]: I1129 15:19:32.915120 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b"] Nov 29 15:19:33 crc kubenswrapper[4907]: I1129 15:19:33.334520 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:33 crc kubenswrapper[4907]: I1129 15:19:33.334583 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:33 crc kubenswrapper[4907]: I1129 15:19:33.479860 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:19:33 crc kubenswrapper[4907]: E1129 15:19:33.480506 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:19:33 crc kubenswrapper[4907]: I1129 15:19:33.483338 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:33 crc kubenswrapper[4907]: I1129 15:19:33.804514 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" event={"ID":"3ce1e052-0764-4391-9bb8-149e06b8744a","Type":"ContainerStarted","Data":"ce8245f9cc1424fc0ecc8b1a1ea3c70d939f12dd27473d2f73bf8cf4a75b879d"} Nov 29 15:19:33 crc kubenswrapper[4907]: I1129 15:19:33.898412 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:33 crc kubenswrapper[4907]: I1129 15:19:33.958294 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s995r"] Nov 29 15:19:34 crc kubenswrapper[4907]: I1129 15:19:34.819591 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" event={"ID":"3ce1e052-0764-4391-9bb8-149e06b8744a","Type":"ContainerStarted","Data":"f5e3ebc3fc42ab62bc6b2669a16520374f927b7621eefb1daa7f2fbab42f6e85"} Nov 29 15:19:34 crc kubenswrapper[4907]: I1129 15:19:34.864587 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" podStartSLOduration=3.1448930920000002 podStartE2EDuration="3.864564733s" podCreationTimestamp="2025-11-29 15:19:31 +0000 UTC" firstStartedPulling="2025-11-29 15:19:32.921759891 +0000 UTC m=+3070.908597553" lastFinishedPulling="2025-11-29 15:19:33.641431522 +0000 UTC m=+3071.628269194" observedRunningTime="2025-11-29 15:19:34.844178208 +0000 UTC m=+3072.831015900" watchObservedRunningTime="2025-11-29 15:19:34.864564733 +0000 UTC m=+3072.851402395" Nov 29 15:19:35 crc kubenswrapper[4907]: I1129 15:19:35.831817 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s995r" podUID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerName="registry-server" containerID="cri-o://a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753" gracePeriod=2 Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.422178 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.509214 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-utilities\") pod \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.509636 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hkwf\" (UniqueName: \"kubernetes.io/projected/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-kube-api-access-8hkwf\") pod \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.509829 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-catalog-content\") pod \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\" (UID: \"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01\") " Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.510111 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-utilities" (OuterVolumeSpecName: "utilities") pod "3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" (UID: "3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.514321 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.519420 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-kube-api-access-8hkwf" (OuterVolumeSpecName: "kube-api-access-8hkwf") pod "3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" (UID: "3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01"). InnerVolumeSpecName "kube-api-access-8hkwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.538640 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" (UID: "3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.616886 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hkwf\" (UniqueName: \"kubernetes.io/projected/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-kube-api-access-8hkwf\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.616944 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.845799 4907 generic.go:334] "Generic (PLEG): container finished" podID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerID="a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753" exitCode=0 Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.845850 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s995r" event={"ID":"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01","Type":"ContainerDied","Data":"a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753"} Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.845897 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s995r" event={"ID":"3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01","Type":"ContainerDied","Data":"4940f6a8c5e4f3aa6e33618eb783c0432f9065225f51a5d89a43b49272b85f47"} Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.845917 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s995r" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.845950 4907 scope.go:117] "RemoveContainer" containerID="a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.887926 4907 scope.go:117] "RemoveContainer" containerID="3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.907420 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s995r"] Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.918257 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s995r"] Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.921563 4907 scope.go:117] "RemoveContainer" containerID="2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.995946 4907 scope.go:117] "RemoveContainer" containerID="a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753" Nov 29 15:19:36 crc kubenswrapper[4907]: E1129 15:19:36.996518 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753\": container with ID starting with a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753 not found: ID does not exist" containerID="a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.996557 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753"} err="failed to get container status \"a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753\": rpc error: code = NotFound desc = could not find container 
\"a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753\": container with ID starting with a16033013bf8c9fd6069904aacd6faa61024b21048a84cf76bbe0b018842e753 not found: ID does not exist" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.996610 4907 scope.go:117] "RemoveContainer" containerID="3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd" Nov 29 15:19:36 crc kubenswrapper[4907]: E1129 15:19:36.996980 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd\": container with ID starting with 3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd not found: ID does not exist" containerID="3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.997087 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd"} err="failed to get container status \"3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd\": rpc error: code = NotFound desc = could not find container \"3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd\": container with ID starting with 3550843dfbd38113a67c43a0c13690aa33bf0cd434dcdd6ee899dd6d80f074cd not found: ID does not exist" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.997193 4907 scope.go:117] "RemoveContainer" containerID="2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933" Nov 29 15:19:36 crc kubenswrapper[4907]: E1129 15:19:36.997669 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933\": container with ID starting with 2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933 not found: ID does not exist" 
containerID="2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933" Nov 29 15:19:36 crc kubenswrapper[4907]: I1129 15:19:36.997718 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933"} err="failed to get container status \"2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933\": rpc error: code = NotFound desc = could not find container \"2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933\": container with ID starting with 2f728c738756f79f55d5ef863153f4b8a06a1789b5ff04ee8819a86dd63fc933 not found: ID does not exist" Nov 29 15:19:38 crc kubenswrapper[4907]: I1129 15:19:38.497151 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" path="/var/lib/kubelet/pods/3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01/volumes" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.028276 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5q48n"] Nov 29 15:19:46 crc kubenswrapper[4907]: E1129 15:19:46.029772 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerName="extract-content" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.029795 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerName="extract-content" Nov 29 15:19:46 crc kubenswrapper[4907]: E1129 15:19:46.029839 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerName="extract-utilities" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.029852 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerName="extract-utilities" Nov 29 15:19:46 crc kubenswrapper[4907]: E1129 15:19:46.030054 4907 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerName="registry-server" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.030067 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerName="registry-server" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.030619 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a63cd3b-8a03-46e1-aeca-85f1e3c4dc01" containerName="registry-server" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.033878 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.065762 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5q48n"] Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.111481 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-catalog-content\") pod \"certified-operators-5q48n\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.111570 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsf4k\" (UniqueName: \"kubernetes.io/projected/5ae43316-5ce6-4481-ba22-55bffc8fdfad-kube-api-access-gsf4k\") pod \"certified-operators-5q48n\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.111619 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-utilities\") pod 
\"certified-operators-5q48n\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.214274 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-catalog-content\") pod \"certified-operators-5q48n\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.214714 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsf4k\" (UniqueName: \"kubernetes.io/projected/5ae43316-5ce6-4481-ba22-55bffc8fdfad-kube-api-access-gsf4k\") pod \"certified-operators-5q48n\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.214745 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-utilities\") pod \"certified-operators-5q48n\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.214849 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-catalog-content\") pod \"certified-operators-5q48n\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.215268 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-utilities\") pod \"certified-operators-5q48n\" (UID: 
\"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.235987 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsf4k\" (UniqueName: \"kubernetes.io/projected/5ae43316-5ce6-4481-ba22-55bffc8fdfad-kube-api-access-gsf4k\") pod \"certified-operators-5q48n\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.364698 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.886229 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5q48n"] Nov 29 15:19:46 crc kubenswrapper[4907]: I1129 15:19:46.989300 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q48n" event={"ID":"5ae43316-5ce6-4481-ba22-55bffc8fdfad","Type":"ContainerStarted","Data":"5cd84fd685f6c31f2adba2e6b1fa4fc4bc888246754286b695353eec11763e4e"} Nov 29 15:19:48 crc kubenswrapper[4907]: I1129 15:19:48.008765 4907 generic.go:334] "Generic (PLEG): container finished" podID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerID="2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be" exitCode=0 Nov 29 15:19:48 crc kubenswrapper[4907]: I1129 15:19:48.008820 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q48n" event={"ID":"5ae43316-5ce6-4481-ba22-55bffc8fdfad","Type":"ContainerDied","Data":"2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be"} Nov 29 15:19:48 crc kubenswrapper[4907]: I1129 15:19:48.480129 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:19:48 crc kubenswrapper[4907]: E1129 
15:19:48.480735 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:19:49 crc kubenswrapper[4907]: I1129 15:19:49.025876 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q48n" event={"ID":"5ae43316-5ce6-4481-ba22-55bffc8fdfad","Type":"ContainerStarted","Data":"54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4"} Nov 29 15:19:50 crc kubenswrapper[4907]: I1129 15:19:50.042233 4907 generic.go:334] "Generic (PLEG): container finished" podID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerID="54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4" exitCode=0 Nov 29 15:19:50 crc kubenswrapper[4907]: I1129 15:19:50.042367 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q48n" event={"ID":"5ae43316-5ce6-4481-ba22-55bffc8fdfad","Type":"ContainerDied","Data":"54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4"} Nov 29 15:19:51 crc kubenswrapper[4907]: I1129 15:19:51.064524 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q48n" event={"ID":"5ae43316-5ce6-4481-ba22-55bffc8fdfad","Type":"ContainerStarted","Data":"aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd"} Nov 29 15:19:51 crc kubenswrapper[4907]: I1129 15:19:51.114295 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5q48n" podStartSLOduration=3.515195137 podStartE2EDuration="6.114268004s" podCreationTimestamp="2025-11-29 15:19:45 +0000 UTC" 
firstStartedPulling="2025-11-29 15:19:48.013116196 +0000 UTC m=+3085.999953888" lastFinishedPulling="2025-11-29 15:19:50.612189063 +0000 UTC m=+3088.599026755" observedRunningTime="2025-11-29 15:19:51.094967039 +0000 UTC m=+3089.081804721" watchObservedRunningTime="2025-11-29 15:19:51.114268004 +0000 UTC m=+3089.101105696" Nov 29 15:19:56 crc kubenswrapper[4907]: I1129 15:19:56.365865 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:56 crc kubenswrapper[4907]: I1129 15:19:56.366666 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:56 crc kubenswrapper[4907]: I1129 15:19:56.439918 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:57 crc kubenswrapper[4907]: I1129 15:19:57.235964 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:57 crc kubenswrapper[4907]: I1129 15:19:57.303427 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5q48n"] Nov 29 15:19:59 crc kubenswrapper[4907]: I1129 15:19:59.164417 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5q48n" podUID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerName="registry-server" containerID="cri-o://aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd" gracePeriod=2 Nov 29 15:19:59 crc kubenswrapper[4907]: I1129 15:19:59.837800 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:19:59 crc kubenswrapper[4907]: I1129 15:19:59.910499 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-catalog-content\") pod \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " Nov 29 15:19:59 crc kubenswrapper[4907]: I1129 15:19:59.910622 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsf4k\" (UniqueName: \"kubernetes.io/projected/5ae43316-5ce6-4481-ba22-55bffc8fdfad-kube-api-access-gsf4k\") pod \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " Nov 29 15:19:59 crc kubenswrapper[4907]: I1129 15:19:59.910711 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-utilities\") pod \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\" (UID: \"5ae43316-5ce6-4481-ba22-55bffc8fdfad\") " Nov 29 15:19:59 crc kubenswrapper[4907]: I1129 15:19:59.912357 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-utilities" (OuterVolumeSpecName: "utilities") pod "5ae43316-5ce6-4481-ba22-55bffc8fdfad" (UID: "5ae43316-5ce6-4481-ba22-55bffc8fdfad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:19:59 crc kubenswrapper[4907]: I1129 15:19:59.916642 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae43316-5ce6-4481-ba22-55bffc8fdfad-kube-api-access-gsf4k" (OuterVolumeSpecName: "kube-api-access-gsf4k") pod "5ae43316-5ce6-4481-ba22-55bffc8fdfad" (UID: "5ae43316-5ce6-4481-ba22-55bffc8fdfad"). InnerVolumeSpecName "kube-api-access-gsf4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:19:59 crc kubenswrapper[4907]: I1129 15:19:59.957999 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ae43316-5ce6-4481-ba22-55bffc8fdfad" (UID: "5ae43316-5ce6-4481-ba22-55bffc8fdfad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.013649 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.013684 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsf4k\" (UniqueName: \"kubernetes.io/projected/5ae43316-5ce6-4481-ba22-55bffc8fdfad-kube-api-access-gsf4k\") on node \"crc\" DevicePath \"\"" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.013696 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae43316-5ce6-4481-ba22-55bffc8fdfad-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.184492 4907 generic.go:334] "Generic (PLEG): container finished" podID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerID="aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd" exitCode=0 Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.184564 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q48n" event={"ID":"5ae43316-5ce6-4481-ba22-55bffc8fdfad","Type":"ContainerDied","Data":"aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd"} Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.184602 4907 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5q48n" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.184653 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q48n" event={"ID":"5ae43316-5ce6-4481-ba22-55bffc8fdfad","Type":"ContainerDied","Data":"5cd84fd685f6c31f2adba2e6b1fa4fc4bc888246754286b695353eec11763e4e"} Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.184686 4907 scope.go:117] "RemoveContainer" containerID="aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.220754 4907 scope.go:117] "RemoveContainer" containerID="54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.258988 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5q48n"] Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.271756 4907 scope.go:117] "RemoveContainer" containerID="2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.275143 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5q48n"] Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.353779 4907 scope.go:117] "RemoveContainer" containerID="aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd" Nov 29 15:20:00 crc kubenswrapper[4907]: E1129 15:20:00.354278 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd\": container with ID starting with aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd not found: ID does not exist" containerID="aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.354328 
4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd"} err="failed to get container status \"aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd\": rpc error: code = NotFound desc = could not find container \"aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd\": container with ID starting with aaaacb40a588e5adaad5f41c6d5f3263b55c6c4e72898e7395ec35246cff8cfd not found: ID does not exist" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.354361 4907 scope.go:117] "RemoveContainer" containerID="54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4" Nov 29 15:20:00 crc kubenswrapper[4907]: E1129 15:20:00.354700 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4\": container with ID starting with 54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4 not found: ID does not exist" containerID="54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.354741 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4"} err="failed to get container status \"54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4\": rpc error: code = NotFound desc = could not find container \"54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4\": container with ID starting with 54b0127cf0e272b5f0260e83d88fc6e50b162bd2e00868ce22a2ca92acf1e0a4 not found: ID does not exist" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.354783 4907 scope.go:117] "RemoveContainer" containerID="2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be" Nov 29 15:20:00 crc kubenswrapper[4907]: E1129 
15:20:00.355427 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be\": container with ID starting with 2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be not found: ID does not exist" containerID="2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.355670 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be"} err="failed to get container status \"2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be\": rpc error: code = NotFound desc = could not find container \"2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be\": container with ID starting with 2f441d9f3935d41e499703d13a55d71610a4b05c51f6b3ec1efc99a62d5f56be not found: ID does not exist" Nov 29 15:20:00 crc kubenswrapper[4907]: I1129 15:20:00.515120 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" path="/var/lib/kubelet/pods/5ae43316-5ce6-4481-ba22-55bffc8fdfad/volumes" Nov 29 15:20:03 crc kubenswrapper[4907]: I1129 15:20:03.479865 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:20:03 crc kubenswrapper[4907]: E1129 15:20:03.480690 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:20:15 crc kubenswrapper[4907]: I1129 15:20:15.479487 
4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:20:15 crc kubenswrapper[4907]: E1129 15:20:15.480340 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:20:27 crc kubenswrapper[4907]: I1129 15:20:27.481529 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:20:27 crc kubenswrapper[4907]: E1129 15:20:27.484187 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:20:38 crc kubenswrapper[4907]: I1129 15:20:38.480547 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:20:38 crc kubenswrapper[4907]: E1129 15:20:38.481282 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:20:49 crc kubenswrapper[4907]: I1129 
15:20:49.479815 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:20:49 crc kubenswrapper[4907]: E1129 15:20:49.480732 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:21:04 crc kubenswrapper[4907]: I1129 15:21:04.497203 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:21:04 crc kubenswrapper[4907]: E1129 15:21:04.499152 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:21:17 crc kubenswrapper[4907]: I1129 15:21:17.481151 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:21:17 crc kubenswrapper[4907]: E1129 15:21:17.481802 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:21:29 crc 
kubenswrapper[4907]: I1129 15:21:29.480914 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:21:29 crc kubenswrapper[4907]: E1129 15:21:29.482659 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:21:43 crc kubenswrapper[4907]: I1129 15:21:43.480663 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:21:43 crc kubenswrapper[4907]: E1129 15:21:43.481831 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:21:50 crc kubenswrapper[4907]: I1129 15:21:50.746323 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ce1e052-0764-4391-9bb8-149e06b8744a" containerID="f5e3ebc3fc42ab62bc6b2669a16520374f927b7621eefb1daa7f2fbab42f6e85" exitCode=0 Nov 29 15:21:50 crc kubenswrapper[4907]: I1129 15:21:50.746429 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" event={"ID":"3ce1e052-0764-4391-9bb8-149e06b8744a","Type":"ContainerDied","Data":"f5e3ebc3fc42ab62bc6b2669a16520374f927b7621eefb1daa7f2fbab42f6e85"} Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.337076 4907 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.475185 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2npjk\" (UniqueName: \"kubernetes.io/projected/3ce1e052-0764-4391-9bb8-149e06b8744a-kube-api-access-2npjk\") pod \"3ce1e052-0764-4391-9bb8-149e06b8744a\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.475582 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-inventory\") pod \"3ce1e052-0764-4391-9bb8-149e06b8744a\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.475784 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-telemetry-power-monitoring-combined-ca-bundle\") pod \"3ce1e052-0764-4391-9bb8-149e06b8744a\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.475939 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-2\") pod \"3ce1e052-0764-4391-9bb8-149e06b8744a\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.476010 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-1\") pod \"3ce1e052-0764-4391-9bb8-149e06b8744a\" (UID: 
\"3ce1e052-0764-4391-9bb8-149e06b8744a\") " Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.476055 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ssh-key\") pod \"3ce1e052-0764-4391-9bb8-149e06b8744a\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.476100 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-0\") pod \"3ce1e052-0764-4391-9bb8-149e06b8744a\" (UID: \"3ce1e052-0764-4391-9bb8-149e06b8744a\") " Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.483662 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "3ce1e052-0764-4391-9bb8-149e06b8744a" (UID: "3ce1e052-0764-4391-9bb8-149e06b8744a"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.501237 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce1e052-0764-4391-9bb8-149e06b8744a-kube-api-access-2npjk" (OuterVolumeSpecName: "kube-api-access-2npjk") pod "3ce1e052-0764-4391-9bb8-149e06b8744a" (UID: "3ce1e052-0764-4391-9bb8-149e06b8744a"). InnerVolumeSpecName "kube-api-access-2npjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.510659 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3ce1e052-0764-4391-9bb8-149e06b8744a" (UID: "3ce1e052-0764-4391-9bb8-149e06b8744a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.513805 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "3ce1e052-0764-4391-9bb8-149e06b8744a" (UID: "3ce1e052-0764-4391-9bb8-149e06b8744a"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.528394 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "3ce1e052-0764-4391-9bb8-149e06b8744a" (UID: "3ce1e052-0764-4391-9bb8-149e06b8744a"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.546169 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "3ce1e052-0764-4391-9bb8-149e06b8744a" (UID: "3ce1e052-0764-4391-9bb8-149e06b8744a"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.552911 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-inventory" (OuterVolumeSpecName: "inventory") pod "3ce1e052-0764-4391-9bb8-149e06b8744a" (UID: "3ce1e052-0764-4391-9bb8-149e06b8744a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.578480 4907 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.579333 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.579421 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.579517 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.579571 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.579632 
4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2npjk\" (UniqueName: \"kubernetes.io/projected/3ce1e052-0764-4391-9bb8-149e06b8744a-kube-api-access-2npjk\") on node \"crc\" DevicePath \"\"" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.579690 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ce1e052-0764-4391-9bb8-149e06b8744a-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.798644 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" event={"ID":"3ce1e052-0764-4391-9bb8-149e06b8744a","Type":"ContainerDied","Data":"ce8245f9cc1424fc0ecc8b1a1ea3c70d939f12dd27473d2f73bf8cf4a75b879d"} Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.798689 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce8245f9cc1424fc0ecc8b1a1ea3c70d939f12dd27473d2f73bf8cf4a75b879d" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.798750 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.910642 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg"] Nov 29 15:21:52 crc kubenswrapper[4907]: E1129 15:21:52.911303 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce1e052-0764-4391-9bb8-149e06b8744a" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.911329 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce1e052-0764-4391-9bb8-149e06b8744a" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Nov 29 15:21:52 crc kubenswrapper[4907]: E1129 15:21:52.911370 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerName="registry-server" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.911380 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerName="registry-server" Nov 29 15:21:52 crc kubenswrapper[4907]: E1129 15:21:52.911405 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerName="extract-utilities" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.911414 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerName="extract-utilities" Nov 29 15:21:52 crc kubenswrapper[4907]: E1129 15:21:52.911467 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerName="extract-content" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.911477 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerName="extract-content" Nov 29 15:21:52 crc 
kubenswrapper[4907]: I1129 15:21:52.911759 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce1e052-0764-4391-9bb8-149e06b8744a" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.911799 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae43316-5ce6-4481-ba22-55bffc8fdfad" containerName="registry-server" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.912825 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.915530 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.915622 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-chzkp" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.915715 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.916502 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.917146 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.926852 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg"] Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.992266 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-inventory\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.992532 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdgvl\" (UniqueName: \"kubernetes.io/projected/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-kube-api-access-zdgvl\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.992568 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.992629 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:52 crc kubenswrapper[4907]: I1129 15:21:52.992689 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.094166 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdgvl\" (UniqueName: \"kubernetes.io/projected/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-kube-api-access-zdgvl\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.094212 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.094260 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.094304 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.094372 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.099696 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.099696 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.100275 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.100521 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc 
kubenswrapper[4907]: I1129 15:21:53.118991 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdgvl\" (UniqueName: \"kubernetes.io/projected/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-kube-api-access-zdgvl\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tmrrg\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.238129 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.666693 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg"] Nov 29 15:21:53 crc kubenswrapper[4907]: I1129 15:21:53.807500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" event={"ID":"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5","Type":"ContainerStarted","Data":"ed737ebd562ea03cd0f506b61de6e0d7f1086bf76a3cb0767d7480f1a4d55381"} Nov 29 15:21:54 crc kubenswrapper[4907]: I1129 15:21:54.818790 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" event={"ID":"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5","Type":"ContainerStarted","Data":"247f6e55f13d862f3588d8b327bf20ff1ed04884c3fbbb72792afb55373a884a"} Nov 29 15:21:54 crc kubenswrapper[4907]: I1129 15:21:54.900752 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" podStartSLOduration=2.330451595 podStartE2EDuration="2.900729021s" podCreationTimestamp="2025-11-29 15:21:52 +0000 UTC" firstStartedPulling="2025-11-29 15:21:53.67090739 +0000 UTC m=+3211.657745042" lastFinishedPulling="2025-11-29 15:21:54.241184816 +0000 UTC m=+3212.228022468" observedRunningTime="2025-11-29 
15:21:54.844745431 +0000 UTC m=+3212.831583093" watchObservedRunningTime="2025-11-29 15:21:54.900729021 +0000 UTC m=+3212.887566673" Nov 29 15:21:58 crc kubenswrapper[4907]: I1129 15:21:58.480707 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:21:58 crc kubenswrapper[4907]: E1129 15:21:58.481392 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:22:12 crc kubenswrapper[4907]: I1129 15:22:12.061259 4907 generic.go:334] "Generic (PLEG): container finished" podID="f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5" containerID="247f6e55f13d862f3588d8b327bf20ff1ed04884c3fbbb72792afb55373a884a" exitCode=0 Nov 29 15:22:12 crc kubenswrapper[4907]: I1129 15:22:12.061388 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" event={"ID":"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5","Type":"ContainerDied","Data":"247f6e55f13d862f3588d8b327bf20ff1ed04884c3fbbb72792afb55373a884a"} Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.480813 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:22:13 crc kubenswrapper[4907]: E1129 15:22:13.481351 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.589633 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.724949 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdgvl\" (UniqueName: \"kubernetes.io/projected/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-kube-api-access-zdgvl\") pod \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.725043 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-0\") pod \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.725198 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-inventory\") pod \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.725250 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-1\") pod \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.725288 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-ssh-key\") pod \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\" (UID: \"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5\") " Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.732702 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-kube-api-access-zdgvl" (OuterVolumeSpecName: "kube-api-access-zdgvl") pod "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5" (UID: "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5"). InnerVolumeSpecName "kube-api-access-zdgvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.758295 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5" (UID: "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.768247 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5" (UID: "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.783207 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5" (UID: "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.785896 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-inventory" (OuterVolumeSpecName: "inventory") pod "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5" (UID: "f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.827505 4907 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.827537 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.827547 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdgvl\" (UniqueName: \"kubernetes.io/projected/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-kube-api-access-zdgvl\") on node \"crc\" DevicePath \"\"" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.827558 4907 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 29 15:22:13 crc kubenswrapper[4907]: I1129 15:22:13.827567 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 15:22:14 crc kubenswrapper[4907]: I1129 15:22:14.104258 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" event={"ID":"f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5","Type":"ContainerDied","Data":"ed737ebd562ea03cd0f506b61de6e0d7f1086bf76a3cb0767d7480f1a4d55381"} Nov 29 15:22:14 crc kubenswrapper[4907]: I1129 15:22:14.104306 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed737ebd562ea03cd0f506b61de6e0d7f1086bf76a3cb0767d7480f1a4d55381" Nov 29 15:22:14 crc kubenswrapper[4907]: I1129 15:22:14.104421 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tmrrg" Nov 29 15:22:28 crc kubenswrapper[4907]: I1129 15:22:28.479596 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:22:28 crc kubenswrapper[4907]: E1129 15:22:28.480417 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:22:42 crc kubenswrapper[4907]: I1129 15:22:42.497386 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:22:42 crc kubenswrapper[4907]: E1129 15:22:42.499219 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 
15:22:53 crc kubenswrapper[4907]: I1129 15:22:53.479560 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:22:53 crc kubenswrapper[4907]: E1129 15:22:53.480431 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:23:06 crc kubenswrapper[4907]: I1129 15:23:06.485386 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:23:06 crc kubenswrapper[4907]: I1129 15:23:06.914531 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"6ed981b048774d1934113df4002fcd7e0133f0776fafb6b129a9cbd0e9afbaab"} Nov 29 15:24:08 crc kubenswrapper[4907]: I1129 15:24:08.918739 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hkvz9"] Nov 29 15:24:08 crc kubenswrapper[4907]: E1129 15:24:08.919868 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5" containerName="logging-edpm-deployment-openstack-edpm-ipam" Nov 29 15:24:08 crc kubenswrapper[4907]: I1129 15:24:08.919886 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5" containerName="logging-edpm-deployment-openstack-edpm-ipam" Nov 29 15:24:08 crc kubenswrapper[4907]: I1129 15:24:08.920206 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5" 
containerName="logging-edpm-deployment-openstack-edpm-ipam" Nov 29 15:24:08 crc kubenswrapper[4907]: I1129 15:24:08.922370 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:08 crc kubenswrapper[4907]: I1129 15:24:08.945771 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hkvz9"] Nov 29 15:24:08 crc kubenswrapper[4907]: I1129 15:24:08.992744 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb626\" (UniqueName: \"kubernetes.io/projected/7a3ffdb7-6209-4231-b11d-f65de8798d27-kube-api-access-jb626\") pod \"community-operators-hkvz9\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:08 crc kubenswrapper[4907]: I1129 15:24:08.992846 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-catalog-content\") pod \"community-operators-hkvz9\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:08 crc kubenswrapper[4907]: I1129 15:24:08.993077 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-utilities\") pod \"community-operators-hkvz9\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:09 crc kubenswrapper[4907]: I1129 15:24:09.097077 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb626\" (UniqueName: \"kubernetes.io/projected/7a3ffdb7-6209-4231-b11d-f65de8798d27-kube-api-access-jb626\") pod \"community-operators-hkvz9\" (UID: 
\"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:09 crc kubenswrapper[4907]: I1129 15:24:09.097415 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-catalog-content\") pod \"community-operators-hkvz9\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:09 crc kubenswrapper[4907]: I1129 15:24:09.097901 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-catalog-content\") pod \"community-operators-hkvz9\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:09 crc kubenswrapper[4907]: I1129 15:24:09.097933 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-utilities\") pod \"community-operators-hkvz9\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:09 crc kubenswrapper[4907]: I1129 15:24:09.098122 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-utilities\") pod \"community-operators-hkvz9\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:09 crc kubenswrapper[4907]: I1129 15:24:09.127392 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb626\" (UniqueName: \"kubernetes.io/projected/7a3ffdb7-6209-4231-b11d-f65de8798d27-kube-api-access-jb626\") pod \"community-operators-hkvz9\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " 
pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:09 crc kubenswrapper[4907]: I1129 15:24:09.248000 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:09 crc kubenswrapper[4907]: I1129 15:24:09.809780 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hkvz9"] Nov 29 15:24:10 crc kubenswrapper[4907]: I1129 15:24:10.839977 4907 generic.go:334] "Generic (PLEG): container finished" podID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerID="c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947" exitCode=0 Nov 29 15:24:10 crc kubenswrapper[4907]: I1129 15:24:10.840071 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkvz9" event={"ID":"7a3ffdb7-6209-4231-b11d-f65de8798d27","Type":"ContainerDied","Data":"c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947"} Nov 29 15:24:10 crc kubenswrapper[4907]: I1129 15:24:10.841594 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkvz9" event={"ID":"7a3ffdb7-6209-4231-b11d-f65de8798d27","Type":"ContainerStarted","Data":"8c50fcd776e04886736c6d725c84a13814c1ed754c9551ee6e619c1265f0f610"} Nov 29 15:24:12 crc kubenswrapper[4907]: I1129 15:24:12.865033 4907 generic.go:334] "Generic (PLEG): container finished" podID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerID="3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94" exitCode=0 Nov 29 15:24:12 crc kubenswrapper[4907]: I1129 15:24:12.865084 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkvz9" event={"ID":"7a3ffdb7-6209-4231-b11d-f65de8798d27","Type":"ContainerDied","Data":"3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94"} Nov 29 15:24:13 crc kubenswrapper[4907]: I1129 15:24:13.882564 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-hkvz9" event={"ID":"7a3ffdb7-6209-4231-b11d-f65de8798d27","Type":"ContainerStarted","Data":"d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40"} Nov 29 15:24:13 crc kubenswrapper[4907]: I1129 15:24:13.925330 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hkvz9" podStartSLOduration=3.447214123 podStartE2EDuration="5.925296274s" podCreationTimestamp="2025-11-29 15:24:08 +0000 UTC" firstStartedPulling="2025-11-29 15:24:10.84656329 +0000 UTC m=+3348.833400932" lastFinishedPulling="2025-11-29 15:24:13.324645421 +0000 UTC m=+3351.311483083" observedRunningTime="2025-11-29 15:24:13.91382214 +0000 UTC m=+3351.900659802" watchObservedRunningTime="2025-11-29 15:24:13.925296274 +0000 UTC m=+3351.912133966" Nov 29 15:24:19 crc kubenswrapper[4907]: I1129 15:24:19.248961 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:19 crc kubenswrapper[4907]: I1129 15:24:19.249553 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:19 crc kubenswrapper[4907]: I1129 15:24:19.317167 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:20 crc kubenswrapper[4907]: I1129 15:24:20.048734 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:20 crc kubenswrapper[4907]: I1129 15:24:20.120062 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hkvz9"] Nov 29 15:24:21 crc kubenswrapper[4907]: I1129 15:24:21.995792 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hkvz9" 
podUID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerName="registry-server" containerID="cri-o://d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40" gracePeriod=2 Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.622655 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.720144 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-catalog-content\") pod \"7a3ffdb7-6209-4231-b11d-f65de8798d27\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.720449 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-utilities\") pod \"7a3ffdb7-6209-4231-b11d-f65de8798d27\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.720622 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb626\" (UniqueName: \"kubernetes.io/projected/7a3ffdb7-6209-4231-b11d-f65de8798d27-kube-api-access-jb626\") pod \"7a3ffdb7-6209-4231-b11d-f65de8798d27\" (UID: \"7a3ffdb7-6209-4231-b11d-f65de8798d27\") " Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.722776 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-utilities" (OuterVolumeSpecName: "utilities") pod "7a3ffdb7-6209-4231-b11d-f65de8798d27" (UID: "7a3ffdb7-6209-4231-b11d-f65de8798d27"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.748596 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3ffdb7-6209-4231-b11d-f65de8798d27-kube-api-access-jb626" (OuterVolumeSpecName: "kube-api-access-jb626") pod "7a3ffdb7-6209-4231-b11d-f65de8798d27" (UID: "7a3ffdb7-6209-4231-b11d-f65de8798d27"). InnerVolumeSpecName "kube-api-access-jb626". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.809272 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a3ffdb7-6209-4231-b11d-f65de8798d27" (UID: "7a3ffdb7-6209-4231-b11d-f65de8798d27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.825160 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.825205 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3ffdb7-6209-4231-b11d-f65de8798d27-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:24:22 crc kubenswrapper[4907]: I1129 15:24:22.825217 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb626\" (UniqueName: \"kubernetes.io/projected/7a3ffdb7-6209-4231-b11d-f65de8798d27-kube-api-access-jb626\") on node \"crc\" DevicePath \"\"" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.012154 4907 generic.go:334] "Generic (PLEG): container finished" podID="7a3ffdb7-6209-4231-b11d-f65de8798d27" 
containerID="d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40" exitCode=0 Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.012203 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkvz9" event={"ID":"7a3ffdb7-6209-4231-b11d-f65de8798d27","Type":"ContainerDied","Data":"d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40"} Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.012234 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkvz9" event={"ID":"7a3ffdb7-6209-4231-b11d-f65de8798d27","Type":"ContainerDied","Data":"8c50fcd776e04886736c6d725c84a13814c1ed754c9551ee6e619c1265f0f610"} Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.012256 4907 scope.go:117] "RemoveContainer" containerID="d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.012326 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hkvz9" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.050084 4907 scope.go:117] "RemoveContainer" containerID="3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.077985 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hkvz9"] Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.092836 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hkvz9"] Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.134602 4907 scope.go:117] "RemoveContainer" containerID="c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.163641 4907 scope.go:117] "RemoveContainer" containerID="d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40" Nov 29 15:24:23 crc kubenswrapper[4907]: E1129 15:24:23.164309 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40\": container with ID starting with d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40 not found: ID does not exist" containerID="d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.164357 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40"} err="failed to get container status \"d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40\": rpc error: code = NotFound desc = could not find container \"d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40\": container with ID starting with d0ec57fc5f202f8526f4aded1d6b1653711ad7c51f67c4b0a41a02a92ab86d40 not 
found: ID does not exist" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.164387 4907 scope.go:117] "RemoveContainer" containerID="3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94" Nov 29 15:24:23 crc kubenswrapper[4907]: E1129 15:24:23.165846 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94\": container with ID starting with 3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94 not found: ID does not exist" containerID="3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.165888 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94"} err="failed to get container status \"3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94\": rpc error: code = NotFound desc = could not find container \"3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94\": container with ID starting with 3eaaa3492312113742ef68d2ec1352b60ae1bfbe0ae350381d004de7d50f6d94 not found: ID does not exist" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.165916 4907 scope.go:117] "RemoveContainer" containerID="c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947" Nov 29 15:24:23 crc kubenswrapper[4907]: E1129 15:24:23.166264 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947\": container with ID starting with c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947 not found: ID does not exist" containerID="c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947" Nov 29 15:24:23 crc kubenswrapper[4907]: I1129 15:24:23.166293 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947"} err="failed to get container status \"c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947\": rpc error: code = NotFound desc = could not find container \"c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947\": container with ID starting with c789734a723b9f4055fb79cb8585608171dd13e516da4bc5ecac1b1fdaf5f947 not found: ID does not exist" Nov 29 15:24:24 crc kubenswrapper[4907]: I1129 15:24:24.499561 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3ffdb7-6209-4231-b11d-f65de8798d27" path="/var/lib/kubelet/pods/7a3ffdb7-6209-4231-b11d-f65de8798d27/volumes" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.250409 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b428w"] Nov 29 15:25:20 crc kubenswrapper[4907]: E1129 15:25:20.251881 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerName="extract-utilities" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.251901 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerName="extract-utilities" Nov 29 15:25:20 crc kubenswrapper[4907]: E1129 15:25:20.251954 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerName="extract-content" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.251970 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerName="extract-content" Nov 29 15:25:20 crc kubenswrapper[4907]: E1129 15:25:20.252019 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerName="registry-server" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 
15:25:20.252033 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerName="registry-server" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.252405 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3ffdb7-6209-4231-b11d-f65de8798d27" containerName="registry-server" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.254665 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.306510 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b428w"] Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.376479 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdq2r\" (UniqueName: \"kubernetes.io/projected/0d6262b2-5a50-49e6-8c46-5315d1d28cec-kube-api-access-rdq2r\") pod \"redhat-operators-b428w\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.376571 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-utilities\") pod \"redhat-operators-b428w\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.377059 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-catalog-content\") pod \"redhat-operators-b428w\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 
15:25:20.479120 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-catalog-content\") pod \"redhat-operators-b428w\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.479221 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdq2r\" (UniqueName: \"kubernetes.io/projected/0d6262b2-5a50-49e6-8c46-5315d1d28cec-kube-api-access-rdq2r\") pod \"redhat-operators-b428w\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.479255 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-utilities\") pod \"redhat-operators-b428w\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.479828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-utilities\") pod \"redhat-operators-b428w\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.479901 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-catalog-content\") pod \"redhat-operators-b428w\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.503585 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rdq2r\" (UniqueName: \"kubernetes.io/projected/0d6262b2-5a50-49e6-8c46-5315d1d28cec-kube-api-access-rdq2r\") pod \"redhat-operators-b428w\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:20 crc kubenswrapper[4907]: I1129 15:25:20.584639 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:21 crc kubenswrapper[4907]: I1129 15:25:21.101213 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b428w"] Nov 29 15:25:21 crc kubenswrapper[4907]: I1129 15:25:21.168369 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b428w" event={"ID":"0d6262b2-5a50-49e6-8c46-5315d1d28cec","Type":"ContainerStarted","Data":"02a85f422f518ff0ef55518811d6f6d11ce157fd8bd2d76e6e832997d5e461d5"} Nov 29 15:25:22 crc kubenswrapper[4907]: I1129 15:25:22.198369 4907 generic.go:334] "Generic (PLEG): container finished" podID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerID="73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123" exitCode=0 Nov 29 15:25:22 crc kubenswrapper[4907]: I1129 15:25:22.199045 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b428w" event={"ID":"0d6262b2-5a50-49e6-8c46-5315d1d28cec","Type":"ContainerDied","Data":"73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123"} Nov 29 15:25:22 crc kubenswrapper[4907]: I1129 15:25:22.203532 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 15:25:24 crc kubenswrapper[4907]: I1129 15:25:24.229757 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b428w" 
event={"ID":"0d6262b2-5a50-49e6-8c46-5315d1d28cec","Type":"ContainerStarted","Data":"ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9"} Nov 29 15:25:26 crc kubenswrapper[4907]: I1129 15:25:26.265231 4907 generic.go:334] "Generic (PLEG): container finished" podID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerID="ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9" exitCode=0 Nov 29 15:25:26 crc kubenswrapper[4907]: I1129 15:25:26.265351 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b428w" event={"ID":"0d6262b2-5a50-49e6-8c46-5315d1d28cec","Type":"ContainerDied","Data":"ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9"} Nov 29 15:25:27 crc kubenswrapper[4907]: I1129 15:25:27.292923 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b428w" event={"ID":"0d6262b2-5a50-49e6-8c46-5315d1d28cec","Type":"ContainerStarted","Data":"149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3"} Nov 29 15:25:27 crc kubenswrapper[4907]: I1129 15:25:27.326122 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b428w" podStartSLOduration=2.630871282 podStartE2EDuration="7.32609628s" podCreationTimestamp="2025-11-29 15:25:20 +0000 UTC" firstStartedPulling="2025-11-29 15:25:22.202868792 +0000 UTC m=+3420.189706504" lastFinishedPulling="2025-11-29 15:25:26.89809382 +0000 UTC m=+3424.884931502" observedRunningTime="2025-11-29 15:25:27.322288322 +0000 UTC m=+3425.309125974" watchObservedRunningTime="2025-11-29 15:25:27.32609628 +0000 UTC m=+3425.312933942" Nov 29 15:25:28 crc kubenswrapper[4907]: I1129 15:25:28.490037 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Nov 29 15:25:28 crc kubenswrapper[4907]: I1129 15:25:28.490346 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:25:30 crc kubenswrapper[4907]: I1129 15:25:30.585371 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:30 crc kubenswrapper[4907]: I1129 15:25:30.586043 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:31 crc kubenswrapper[4907]: I1129 15:25:31.648241 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b428w" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerName="registry-server" probeResult="failure" output=< Nov 29 15:25:31 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 15:25:31 crc kubenswrapper[4907]: > Nov 29 15:25:40 crc kubenswrapper[4907]: I1129 15:25:40.672418 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:40 crc kubenswrapper[4907]: I1129 15:25:40.748302 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:42 crc kubenswrapper[4907]: I1129 15:25:42.199625 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b428w"] Nov 29 15:25:42 crc kubenswrapper[4907]: I1129 15:25:42.495223 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b428w" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" 
containerName="registry-server" containerID="cri-o://149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3" gracePeriod=2 Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.042356 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.130990 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-utilities\") pod \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.131053 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-catalog-content\") pod \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.131301 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdq2r\" (UniqueName: \"kubernetes.io/projected/0d6262b2-5a50-49e6-8c46-5315d1d28cec-kube-api-access-rdq2r\") pod \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\" (UID: \"0d6262b2-5a50-49e6-8c46-5315d1d28cec\") " Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.131734 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-utilities" (OuterVolumeSpecName: "utilities") pod "0d6262b2-5a50-49e6-8c46-5315d1d28cec" (UID: "0d6262b2-5a50-49e6-8c46-5315d1d28cec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.132540 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.147090 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d6262b2-5a50-49e6-8c46-5315d1d28cec-kube-api-access-rdq2r" (OuterVolumeSpecName: "kube-api-access-rdq2r") pod "0d6262b2-5a50-49e6-8c46-5315d1d28cec" (UID: "0d6262b2-5a50-49e6-8c46-5315d1d28cec"). InnerVolumeSpecName "kube-api-access-rdq2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.234279 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdq2r\" (UniqueName: \"kubernetes.io/projected/0d6262b2-5a50-49e6-8c46-5315d1d28cec-kube-api-access-rdq2r\") on node \"crc\" DevicePath \"\"" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.252358 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d6262b2-5a50-49e6-8c46-5315d1d28cec" (UID: "0d6262b2-5a50-49e6-8c46-5315d1d28cec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.336822 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d6262b2-5a50-49e6-8c46-5315d1d28cec-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.514874 4907 generic.go:334] "Generic (PLEG): container finished" podID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerID="149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3" exitCode=0 Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.515066 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b428w" event={"ID":"0d6262b2-5a50-49e6-8c46-5315d1d28cec","Type":"ContainerDied","Data":"149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3"} Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.515167 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b428w" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.515188 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b428w" event={"ID":"0d6262b2-5a50-49e6-8c46-5315d1d28cec","Type":"ContainerDied","Data":"02a85f422f518ff0ef55518811d6f6d11ce157fd8bd2d76e6e832997d5e461d5"} Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.515218 4907 scope.go:117] "RemoveContainer" containerID="149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.548626 4907 scope.go:117] "RemoveContainer" containerID="ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.560741 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b428w"] Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.570099 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b428w"] Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.570940 4907 scope.go:117] "RemoveContainer" containerID="73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.620481 4907 scope.go:117] "RemoveContainer" containerID="149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3" Nov 29 15:25:43 crc kubenswrapper[4907]: E1129 15:25:43.620854 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3\": container with ID starting with 149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3 not found: ID does not exist" containerID="149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.620891 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3"} err="failed to get container status \"149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3\": rpc error: code = NotFound desc = could not find container \"149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3\": container with ID starting with 149c524bce4762a598b83b5ce7e712a106e8485ec7c99dd2d8e893b614f699b3 not found: ID does not exist" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.620913 4907 scope.go:117] "RemoveContainer" containerID="ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9" Nov 29 15:25:43 crc kubenswrapper[4907]: E1129 15:25:43.621179 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9\": container with ID starting with ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9 not found: ID does not exist" containerID="ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.621221 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9"} err="failed to get container status \"ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9\": rpc error: code = NotFound desc = could not find container \"ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9\": container with ID starting with ea1f4918f4fe46e7addcb4f419785597e7f23ed6c5ebb07a698536818768e5a9 not found: ID does not exist" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.621248 4907 scope.go:117] "RemoveContainer" containerID="73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123" Nov 29 15:25:43 crc kubenswrapper[4907]: E1129 
15:25:43.621732 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123\": container with ID starting with 73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123 not found: ID does not exist" containerID="73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123" Nov 29 15:25:43 crc kubenswrapper[4907]: I1129 15:25:43.621759 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123"} err="failed to get container status \"73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123\": rpc error: code = NotFound desc = could not find container \"73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123\": container with ID starting with 73d5fb5f787175bd839a53d5ead9c27fa938e3beeacd56a652f194893743b123 not found: ID does not exist" Nov 29 15:25:44 crc kubenswrapper[4907]: I1129 15:25:44.509322 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" path="/var/lib/kubelet/pods/0d6262b2-5a50-49e6-8c46-5315d1d28cec/volumes" Nov 29 15:25:58 crc kubenswrapper[4907]: I1129 15:25:58.490428 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:25:58 crc kubenswrapper[4907]: I1129 15:25:58.491247 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 29 15:26:28 crc kubenswrapper[4907]: I1129 15:26:28.490565 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:26:28 crc kubenswrapper[4907]: I1129 15:26:28.491210 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:26:28 crc kubenswrapper[4907]: I1129 15:26:28.495395 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:26:28 crc kubenswrapper[4907]: I1129 15:26:28.496664 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ed981b048774d1934113df4002fcd7e0133f0776fafb6b129a9cbd0e9afbaab"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:26:28 crc kubenswrapper[4907]: I1129 15:26:28.496778 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://6ed981b048774d1934113df4002fcd7e0133f0776fafb6b129a9cbd0e9afbaab" gracePeriod=600 Nov 29 15:26:29 crc kubenswrapper[4907]: I1129 15:26:29.130909 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" 
containerID="6ed981b048774d1934113df4002fcd7e0133f0776fafb6b129a9cbd0e9afbaab" exitCode=0 Nov 29 15:26:29 crc kubenswrapper[4907]: I1129 15:26:29.131102 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"6ed981b048774d1934113df4002fcd7e0133f0776fafb6b129a9cbd0e9afbaab"} Nov 29 15:26:29 crc kubenswrapper[4907]: I1129 15:26:29.131263 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"} Nov 29 15:26:29 crc kubenswrapper[4907]: I1129 15:26:29.131287 4907 scope.go:117] "RemoveContainer" containerID="978b4d994d089c3278e49a18bae6feb87730feb3548d49d04d6c48b57eaaf3ca" Nov 29 15:28:28 crc kubenswrapper[4907]: I1129 15:28:28.490444 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:28:28 crc kubenswrapper[4907]: I1129 15:28:28.491211 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:28:48 crc kubenswrapper[4907]: E1129 15:28:48.542505 4907 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.47:41946->38.102.83.47:43783: read tcp 38.102.83.47:41946->38.102.83.47:43783: read: connection reset by peer Nov 29 15:28:58 crc kubenswrapper[4907]: 
I1129 15:28:58.489936 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:28:58 crc kubenswrapper[4907]: I1129 15:28:58.490588 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:29:28 crc kubenswrapper[4907]: I1129 15:29:28.490010 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:29:28 crc kubenswrapper[4907]: I1129 15:29:28.490619 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:29:28 crc kubenswrapper[4907]: I1129 15:29:28.498319 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:29:28 crc kubenswrapper[4907]: I1129 15:29:28.500234 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"} 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:29:28 crc kubenswrapper[4907]: I1129 15:29:28.500347 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2" gracePeriod=600 Nov 29 15:29:28 crc kubenswrapper[4907]: E1129 15:29:28.642971 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:29:28 crc kubenswrapper[4907]: I1129 15:29:28.652488 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2" exitCode=0 Nov 29 15:29:28 crc kubenswrapper[4907]: I1129 15:29:28.652583 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"} Nov 29 15:29:28 crc kubenswrapper[4907]: I1129 15:29:28.652682 4907 scope.go:117] "RemoveContainer" containerID="6ed981b048774d1934113df4002fcd7e0133f0776fafb6b129a9cbd0e9afbaab" Nov 29 15:29:28 crc kubenswrapper[4907]: I1129 15:29:28.654143 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2" Nov 
29 15:29:28 crc kubenswrapper[4907]: E1129 15:29:28.654846 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:29:43 crc kubenswrapper[4907]: I1129 15:29:43.480647 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2" Nov 29 15:29:43 crc kubenswrapper[4907]: E1129 15:29:43.481724 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:29:46 crc kubenswrapper[4907]: I1129 15:29:46.879598 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lk8wb"] Nov 29 15:29:46 crc kubenswrapper[4907]: E1129 15:29:46.880925 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerName="extract-content" Nov 29 15:29:46 crc kubenswrapper[4907]: I1129 15:29:46.880950 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerName="extract-content" Nov 29 15:29:46 crc kubenswrapper[4907]: E1129 15:29:46.881027 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerName="extract-utilities" Nov 29 15:29:46 crc kubenswrapper[4907]: I1129 15:29:46.881043 
4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerName="extract-utilities" Nov 29 15:29:46 crc kubenswrapper[4907]: E1129 15:29:46.881088 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerName="registry-server" Nov 29 15:29:46 crc kubenswrapper[4907]: I1129 15:29:46.881101 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerName="registry-server" Nov 29 15:29:46 crc kubenswrapper[4907]: I1129 15:29:46.881524 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6262b2-5a50-49e6-8c46-5315d1d28cec" containerName="registry-server" Nov 29 15:29:46 crc kubenswrapper[4907]: I1129 15:29:46.888027 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:46 crc kubenswrapper[4907]: I1129 15:29:46.896682 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk8wb"] Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.040997 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-catalog-content\") pod \"redhat-marketplace-lk8wb\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.041045 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-utilities\") pod \"redhat-marketplace-lk8wb\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.041196 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf6pl\" (UniqueName: \"kubernetes.io/projected/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-kube-api-access-mf6pl\") pod \"redhat-marketplace-lk8wb\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.143357 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf6pl\" (UniqueName: \"kubernetes.io/projected/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-kube-api-access-mf6pl\") pod \"redhat-marketplace-lk8wb\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.143512 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-catalog-content\") pod \"redhat-marketplace-lk8wb\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.143551 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-utilities\") pod \"redhat-marketplace-lk8wb\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.144050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-catalog-content\") pod \"redhat-marketplace-lk8wb\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.144074 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-utilities\") pod \"redhat-marketplace-lk8wb\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.168164 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf6pl\" (UniqueName: \"kubernetes.io/projected/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-kube-api-access-mf6pl\") pod \"redhat-marketplace-lk8wb\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.221764 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.709270 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk8wb"] Nov 29 15:29:47 crc kubenswrapper[4907]: I1129 15:29:47.945425 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk8wb" event={"ID":"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b","Type":"ContainerStarted","Data":"b05fa61884089bd704c9edd8731c043fd23d31dfbf4a135b92057171fa4c519c"} Nov 29 15:29:48 crc kubenswrapper[4907]: I1129 15:29:48.971256 4907 generic.go:334] "Generic (PLEG): container finished" podID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerID="e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01" exitCode=0 Nov 29 15:29:48 crc kubenswrapper[4907]: I1129 15:29:48.971392 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk8wb" event={"ID":"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b","Type":"ContainerDied","Data":"e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01"} Nov 29 15:29:51 crc 
kubenswrapper[4907]: I1129 15:29:51.000641 4907 generic.go:334] "Generic (PLEG): container finished" podID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerID="1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830" exitCode=0 Nov 29 15:29:51 crc kubenswrapper[4907]: I1129 15:29:51.000713 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk8wb" event={"ID":"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b","Type":"ContainerDied","Data":"1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830"} Nov 29 15:29:52 crc kubenswrapper[4907]: I1129 15:29:52.020525 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk8wb" event={"ID":"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b","Type":"ContainerStarted","Data":"f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b"} Nov 29 15:29:52 crc kubenswrapper[4907]: I1129 15:29:52.047925 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lk8wb" podStartSLOduration=3.541221117 podStartE2EDuration="6.047906702s" podCreationTimestamp="2025-11-29 15:29:46 +0000 UTC" firstStartedPulling="2025-11-29 15:29:48.975160984 +0000 UTC m=+3686.961998666" lastFinishedPulling="2025-11-29 15:29:51.481846599 +0000 UTC m=+3689.468684251" observedRunningTime="2025-11-29 15:29:52.037828098 +0000 UTC m=+3690.024665790" watchObservedRunningTime="2025-11-29 15:29:52.047906702 +0000 UTC m=+3690.034744355" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.325889 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7pm6r"] Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.331677 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.336509 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7pm6r"] Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.433028 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-catalog-content\") pod \"certified-operators-7pm6r\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") " pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.433173 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f5cq\" (UniqueName: \"kubernetes.io/projected/277653db-feea-4592-9b50-736eea017ff9-kube-api-access-8f5cq\") pod \"certified-operators-7pm6r\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") " pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.433243 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-utilities\") pod \"certified-operators-7pm6r\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") " pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.535420 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-catalog-content\") pod \"certified-operators-7pm6r\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") " pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.536002 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8f5cq\" (UniqueName: \"kubernetes.io/projected/277653db-feea-4592-9b50-736eea017ff9-kube-api-access-8f5cq\") pod \"certified-operators-7pm6r\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") " pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.536066 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-utilities\") pod \"certified-operators-7pm6r\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") " pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.535882 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-catalog-content\") pod \"certified-operators-7pm6r\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") " pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.536785 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-utilities\") pod \"certified-operators-7pm6r\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") " pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.568790 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f5cq\" (UniqueName: \"kubernetes.io/projected/277653db-feea-4592-9b50-736eea017ff9-kube-api-access-8f5cq\") pod \"certified-operators-7pm6r\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") " pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:54 crc kubenswrapper[4907]: I1129 15:29:54.663209 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7pm6r" Nov 29 15:29:55 crc kubenswrapper[4907]: I1129 15:29:55.198288 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7pm6r"] Nov 29 15:29:56 crc kubenswrapper[4907]: I1129 15:29:56.066768 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pm6r" event={"ID":"277653db-feea-4592-9b50-736eea017ff9","Type":"ContainerDied","Data":"7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce"} Nov 29 15:29:56 crc kubenswrapper[4907]: I1129 15:29:56.069667 4907 generic.go:334] "Generic (PLEG): container finished" podID="277653db-feea-4592-9b50-736eea017ff9" containerID="7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce" exitCode=0 Nov 29 15:29:56 crc kubenswrapper[4907]: I1129 15:29:56.069810 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pm6r" event={"ID":"277653db-feea-4592-9b50-736eea017ff9","Type":"ContainerStarted","Data":"2fa9b571802991b35b94f3d3e62b21ebb29f18e2942ce5f6a8827764af0a96dd"} Nov 29 15:29:56 crc kubenswrapper[4907]: I1129 15:29:56.481329 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2" Nov 29 15:29:56 crc kubenswrapper[4907]: E1129 15:29:56.482227 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:29:57 crc kubenswrapper[4907]: I1129 15:29:57.222986 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:57 crc kubenswrapper[4907]: I1129 15:29:57.223323 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:57 crc kubenswrapper[4907]: I1129 15:29:57.277145 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:58 crc kubenswrapper[4907]: I1129 15:29:58.093324 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pm6r" event={"ID":"277653db-feea-4592-9b50-736eea017ff9","Type":"ContainerStarted","Data":"3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633"} Nov 29 15:29:58 crc kubenswrapper[4907]: I1129 15:29:58.155484 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:29:59 crc kubenswrapper[4907]: I1129 15:29:59.112495 4907 generic.go:334] "Generic (PLEG): container finished" podID="277653db-feea-4592-9b50-736eea017ff9" containerID="3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633" exitCode=0 Nov 29 15:29:59 crc kubenswrapper[4907]: I1129 15:29:59.113416 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pm6r" event={"ID":"277653db-feea-4592-9b50-736eea017ff9","Type":"ContainerDied","Data":"3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633"} Nov 29 15:29:59 crc kubenswrapper[4907]: I1129 15:29:59.126843 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk8wb"] Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.129801 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pm6r" 
event={"ID":"277653db-feea-4592-9b50-736eea017ff9","Type":"ContainerStarted","Data":"f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82"} Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.129950 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lk8wb" podUID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerName="registry-server" containerID="cri-o://f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b" gracePeriod=2 Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.158396 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7pm6r" podStartSLOduration=2.655888128 podStartE2EDuration="6.158380238s" podCreationTimestamp="2025-11-29 15:29:54 +0000 UTC" firstStartedPulling="2025-11-29 15:29:56.069921575 +0000 UTC m=+3694.056759267" lastFinishedPulling="2025-11-29 15:29:59.572413715 +0000 UTC m=+3697.559251377" observedRunningTime="2025-11-29 15:30:00.154782827 +0000 UTC m=+3698.141620479" watchObservedRunningTime="2025-11-29 15:30:00.158380238 +0000 UTC m=+3698.145217890" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.193559 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27"] Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.195364 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.198048 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.198082 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.207083 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27"] Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.290049 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73b2ed79-7759-486a-bc40-c260b58ae2fa-config-volume\") pod \"collect-profiles-29407170-pcd27\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.290101 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4czxq\" (UniqueName: \"kubernetes.io/projected/73b2ed79-7759-486a-bc40-c260b58ae2fa-kube-api-access-4czxq\") pod \"collect-profiles-29407170-pcd27\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.290247 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73b2ed79-7759-486a-bc40-c260b58ae2fa-secret-volume\") pod \"collect-profiles-29407170-pcd27\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.392473 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73b2ed79-7759-486a-bc40-c260b58ae2fa-config-volume\") pod \"collect-profiles-29407170-pcd27\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.392840 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4czxq\" (UniqueName: \"kubernetes.io/projected/73b2ed79-7759-486a-bc40-c260b58ae2fa-kube-api-access-4czxq\") pod \"collect-profiles-29407170-pcd27\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.392985 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73b2ed79-7759-486a-bc40-c260b58ae2fa-secret-volume\") pod \"collect-profiles-29407170-pcd27\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.393182 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73b2ed79-7759-486a-bc40-c260b58ae2fa-config-volume\") pod \"collect-profiles-29407170-pcd27\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.398268 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/73b2ed79-7759-486a-bc40-c260b58ae2fa-secret-volume\") pod \"collect-profiles-29407170-pcd27\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.434191 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4czxq\" (UniqueName: \"kubernetes.io/projected/73b2ed79-7759-486a-bc40-c260b58ae2fa-kube-api-access-4czxq\") pod \"collect-profiles-29407170-pcd27\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.541818 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.608522 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.700123 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-utilities\") pod \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.700392 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf6pl\" (UniqueName: \"kubernetes.io/projected/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-kube-api-access-mf6pl\") pod \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.700542 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-catalog-content\") pod \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\" (UID: \"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b\") " Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.700996 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-utilities" (OuterVolumeSpecName: "utilities") pod "d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" (UID: "d9be91a6-a6ed-4689-8f97-226e7fd2ce2b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.701119 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.722889 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-kube-api-access-mf6pl" (OuterVolumeSpecName: "kube-api-access-mf6pl") pod "d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" (UID: "d9be91a6-a6ed-4689-8f97-226e7fd2ce2b"). InnerVolumeSpecName "kube-api-access-mf6pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.729062 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" (UID: "d9be91a6-a6ed-4689-8f97-226e7fd2ce2b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.802751 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:30:00 crc kubenswrapper[4907]: I1129 15:30:00.802783 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf6pl\" (UniqueName: \"kubernetes.io/projected/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b-kube-api-access-mf6pl\") on node \"crc\" DevicePath \"\"" Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.034132 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27"] Nov 29 15:30:01 crc kubenswrapper[4907]: W1129 15:30:01.035063 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b2ed79_7759_486a_bc40_c260b58ae2fa.slice/crio-93b356eabc64fb744f0079285730731b39d3d70137ddfbe090a605e7cde2b2c8 WatchSource:0}: Error finding container 93b356eabc64fb744f0079285730731b39d3d70137ddfbe090a605e7cde2b2c8: Status 404 returned error can't find the container with id 93b356eabc64fb744f0079285730731b39d3d70137ddfbe090a605e7cde2b2c8 Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.142939 4907 generic.go:334] "Generic (PLEG): container finished" podID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerID="f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b" exitCode=0 Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.143020 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk8wb" event={"ID":"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b","Type":"ContainerDied","Data":"f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b"} Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.143046 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk8wb" event={"ID":"d9be91a6-a6ed-4689-8f97-226e7fd2ce2b","Type":"ContainerDied","Data":"b05fa61884089bd704c9edd8731c043fd23d31dfbf4a135b92057171fa4c519c"} Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.143063 4907 scope.go:117] "RemoveContainer" containerID="f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b" Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.143068 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lk8wb" Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.144451 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" event={"ID":"73b2ed79-7759-486a-bc40-c260b58ae2fa","Type":"ContainerStarted","Data":"93b356eabc64fb744f0079285730731b39d3d70137ddfbe090a605e7cde2b2c8"} Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.212231 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk8wb"] Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.212591 4907 scope.go:117] "RemoveContainer" containerID="1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830" Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.222940 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk8wb"] Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.265905 4907 scope.go:117] "RemoveContainer" containerID="e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01" Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.367872 4907 scope.go:117] "RemoveContainer" containerID="f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b" Nov 29 15:30:01 crc kubenswrapper[4907]: E1129 15:30:01.373091 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b\": container with ID starting with f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b not found: ID does not exist" containerID="f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b"
Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.373142 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b"} err="failed to get container status \"f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b\": rpc error: code = NotFound desc = could not find container \"f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b\": container with ID starting with f79c9894144ccd38d83622b1431f77c68a700fcd207ce9f9acbe3aa9888a3c2b not found: ID does not exist"
Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.373165 4907 scope.go:117] "RemoveContainer" containerID="1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830"
Nov 29 15:30:01 crc kubenswrapper[4907]: E1129 15:30:01.376779 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830\": container with ID starting with 1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830 not found: ID does not exist" containerID="1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830"
Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.376814 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830"} err="failed to get container status \"1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830\": rpc error: code = NotFound desc = could not find container \"1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830\": container with ID starting with 1b3b76c59142afacc9e0bdb44cf13e0f7a78007403262f7d9021c142ee92d830 not found: ID does not exist"
Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.376831 4907 scope.go:117] "RemoveContainer" containerID="e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01"
Nov 29 15:30:01 crc kubenswrapper[4907]: E1129 15:30:01.380739 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01\": container with ID starting with e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01 not found: ID does not exist" containerID="e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01"
Nov 29 15:30:01 crc kubenswrapper[4907]: I1129 15:30:01.380773 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01"} err="failed to get container status \"e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01\": rpc error: code = NotFound desc = could not find container \"e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01\": container with ID starting with e71da0ec6d9897ec705e5e8e1018c4a354d8e8d0c7c97808aa3583f01b035b01 not found: ID does not exist"
Nov 29 15:30:02 crc kubenswrapper[4907]: I1129 15:30:02.156101 4907 generic.go:334] "Generic (PLEG): container finished" podID="73b2ed79-7759-486a-bc40-c260b58ae2fa" containerID="400e58cf5254f729712a980e364110b1cb31120063a998829a3cd29bbbc59e9f" exitCode=0
Nov 29 15:30:02 crc kubenswrapper[4907]: I1129 15:30:02.156171 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" event={"ID":"73b2ed79-7759-486a-bc40-c260b58ae2fa","Type":"ContainerDied","Data":"400e58cf5254f729712a980e364110b1cb31120063a998829a3cd29bbbc59e9f"}
Nov 29 15:30:02 crc kubenswrapper[4907]: I1129 15:30:02.510305 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" path="/var/lib/kubelet/pods/d9be91a6-a6ed-4689-8f97-226e7fd2ce2b/volumes"
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.674468 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27"
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.777865 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73b2ed79-7759-486a-bc40-c260b58ae2fa-secret-volume\") pod \"73b2ed79-7759-486a-bc40-c260b58ae2fa\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") "
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.778141 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4czxq\" (UniqueName: \"kubernetes.io/projected/73b2ed79-7759-486a-bc40-c260b58ae2fa-kube-api-access-4czxq\") pod \"73b2ed79-7759-486a-bc40-c260b58ae2fa\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") "
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.778258 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73b2ed79-7759-486a-bc40-c260b58ae2fa-config-volume\") pod \"73b2ed79-7759-486a-bc40-c260b58ae2fa\" (UID: \"73b2ed79-7759-486a-bc40-c260b58ae2fa\") "
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.780027 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73b2ed79-7759-486a-bc40-c260b58ae2fa-config-volume" (OuterVolumeSpecName: "config-volume") pod "73b2ed79-7759-486a-bc40-c260b58ae2fa" (UID: "73b2ed79-7759-486a-bc40-c260b58ae2fa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.786705 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b2ed79-7759-486a-bc40-c260b58ae2fa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "73b2ed79-7759-486a-bc40-c260b58ae2fa" (UID: "73b2ed79-7759-486a-bc40-c260b58ae2fa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.788877 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b2ed79-7759-486a-bc40-c260b58ae2fa-kube-api-access-4czxq" (OuterVolumeSpecName: "kube-api-access-4czxq") pod "73b2ed79-7759-486a-bc40-c260b58ae2fa" (UID: "73b2ed79-7759-486a-bc40-c260b58ae2fa"). InnerVolumeSpecName "kube-api-access-4czxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.881608 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73b2ed79-7759-486a-bc40-c260b58ae2fa-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.881642 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4czxq\" (UniqueName: \"kubernetes.io/projected/73b2ed79-7759-486a-bc40-c260b58ae2fa-kube-api-access-4czxq\") on node \"crc\" DevicePath \"\""
Nov 29 15:30:03 crc kubenswrapper[4907]: I1129 15:30:03.881651 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73b2ed79-7759-486a-bc40-c260b58ae2fa-config-volume\") on node \"crc\" DevicePath \"\""
Nov 29 15:30:04 crc kubenswrapper[4907]: I1129 15:30:04.182777 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27" event={"ID":"73b2ed79-7759-486a-bc40-c260b58ae2fa","Type":"ContainerDied","Data":"93b356eabc64fb744f0079285730731b39d3d70137ddfbe090a605e7cde2b2c8"}
Nov 29 15:30:04 crc kubenswrapper[4907]: I1129 15:30:04.182841 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93b356eabc64fb744f0079285730731b39d3d70137ddfbe090a605e7cde2b2c8"
Nov 29 15:30:04 crc kubenswrapper[4907]: I1129 15:30:04.182917 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27"
Nov 29 15:30:04 crc kubenswrapper[4907]: I1129 15:30:04.664528 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7pm6r"
Nov 29 15:30:04 crc kubenswrapper[4907]: I1129 15:30:04.665214 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7pm6r"
Nov 29 15:30:04 crc kubenswrapper[4907]: I1129 15:30:04.743746 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7pm6r"
Nov 29 15:30:04 crc kubenswrapper[4907]: I1129 15:30:04.798855 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf"]
Nov 29 15:30:04 crc kubenswrapper[4907]: I1129 15:30:04.811652 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407125-8fwrf"]
Nov 29 15:30:05 crc kubenswrapper[4907]: I1129 15:30:05.292282 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7pm6r"
Nov 29 15:30:05 crc kubenswrapper[4907]: I1129 15:30:05.902548 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7pm6r"]
Nov 29 15:30:06 crc kubenswrapper[4907]: I1129 15:30:06.507110 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da3e28a2-d307-482f-b76f-a3b0d36bcc1a" path="/var/lib/kubelet/pods/da3e28a2-d307-482f-b76f-a3b0d36bcc1a/volumes"
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.215047 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7pm6r" podUID="277653db-feea-4592-9b50-736eea017ff9" containerName="registry-server" containerID="cri-o://f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82" gracePeriod=2
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.479942 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:30:07 crc kubenswrapper[4907]: E1129 15:30:07.480269 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.839357 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7pm6r"
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.902136 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f5cq\" (UniqueName: \"kubernetes.io/projected/277653db-feea-4592-9b50-736eea017ff9-kube-api-access-8f5cq\") pod \"277653db-feea-4592-9b50-736eea017ff9\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") "
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.902390 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-catalog-content\") pod \"277653db-feea-4592-9b50-736eea017ff9\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") "
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.902473 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-utilities\") pod \"277653db-feea-4592-9b50-736eea017ff9\" (UID: \"277653db-feea-4592-9b50-736eea017ff9\") "
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.903352 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-utilities" (OuterVolumeSpecName: "utilities") pod "277653db-feea-4592-9b50-736eea017ff9" (UID: "277653db-feea-4592-9b50-736eea017ff9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.904123 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.912683 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277653db-feea-4592-9b50-736eea017ff9-kube-api-access-8f5cq" (OuterVolumeSpecName: "kube-api-access-8f5cq") pod "277653db-feea-4592-9b50-736eea017ff9" (UID: "277653db-feea-4592-9b50-736eea017ff9"). InnerVolumeSpecName "kube-api-access-8f5cq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 15:30:07 crc kubenswrapper[4907]: I1129 15:30:07.961756 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "277653db-feea-4592-9b50-736eea017ff9" (UID: "277653db-feea-4592-9b50-736eea017ff9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.006429 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277653db-feea-4592-9b50-736eea017ff9-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.006497 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f5cq\" (UniqueName: \"kubernetes.io/projected/277653db-feea-4592-9b50-736eea017ff9-kube-api-access-8f5cq\") on node \"crc\" DevicePath \"\""
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.231903 4907 generic.go:334] "Generic (PLEG): container finished" podID="277653db-feea-4592-9b50-736eea017ff9" containerID="f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82" exitCode=0
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.231970 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pm6r" event={"ID":"277653db-feea-4592-9b50-736eea017ff9","Type":"ContainerDied","Data":"f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82"}
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.232033 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7pm6r"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.232071 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7pm6r" event={"ID":"277653db-feea-4592-9b50-736eea017ff9","Type":"ContainerDied","Data":"2fa9b571802991b35b94f3d3e62b21ebb29f18e2942ce5f6a8827764af0a96dd"}
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.232116 4907 scope.go:117] "RemoveContainer" containerID="f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.284694 4907 scope.go:117] "RemoveContainer" containerID="3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.322415 4907 scope.go:117] "RemoveContainer" containerID="7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.323237 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7pm6r"]
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.345740 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7pm6r"]
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.391104 4907 scope.go:117] "RemoveContainer" containerID="f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82"
Nov 29 15:30:08 crc kubenswrapper[4907]: E1129 15:30:08.391626 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82\": container with ID starting with f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82 not found: ID does not exist" containerID="f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.391676 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82"} err="failed to get container status \"f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82\": rpc error: code = NotFound desc = could not find container \"f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82\": container with ID starting with f9a6bdb839edb1ddb7a5d2bbb43d79022bb22c81578a1502e23b589e0f13ea82 not found: ID does not exist"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.391714 4907 scope.go:117] "RemoveContainer" containerID="3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633"
Nov 29 15:30:08 crc kubenswrapper[4907]: E1129 15:30:08.392732 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633\": container with ID starting with 3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633 not found: ID does not exist" containerID="3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.392804 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633"} err="failed to get container status \"3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633\": rpc error: code = NotFound desc = could not find container \"3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633\": container with ID starting with 3e9c3b7b7c212f6ebc9c29036103309987d62e96d9f6073aa62e0a6c02185633 not found: ID does not exist"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.392852 4907 scope.go:117] "RemoveContainer" containerID="7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce"
Nov 29 15:30:08 crc kubenswrapper[4907]: E1129 15:30:08.393346 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce\": container with ID starting with 7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce not found: ID does not exist" containerID="7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.393400 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce"} err="failed to get container status \"7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce\": rpc error: code = NotFound desc = could not find container \"7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce\": container with ID starting with 7b60d9b7e248135d9823ef8991ef2da1d10526b8295d84d38af7fc4dc37466ce not found: ID does not exist"
Nov 29 15:30:08 crc kubenswrapper[4907]: I1129 15:30:08.501888 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277653db-feea-4592-9b50-736eea017ff9" path="/var/lib/kubelet/pods/277653db-feea-4592-9b50-736eea017ff9/volumes"
Nov 29 15:30:19 crc kubenswrapper[4907]: I1129 15:30:19.480812 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:30:19 crc kubenswrapper[4907]: E1129 15:30:19.481769 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:30:30 crc kubenswrapper[4907]: I1129 15:30:30.043834 4907 scope.go:117] "RemoveContainer" containerID="bc3a1b63452d37ace577ca8b60f10ee63713a31408b237fab5e498dce9eaa921"
Nov 29 15:30:34 crc kubenswrapper[4907]: I1129 15:30:34.480140 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:30:34 crc kubenswrapper[4907]: E1129 15:30:34.481146 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:30:46 crc kubenswrapper[4907]: I1129 15:30:46.479538 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:30:46 crc kubenswrapper[4907]: E1129 15:30:46.480176 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:30:57 crc kubenswrapper[4907]: I1129 15:30:57.479996 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:30:57 crc kubenswrapper[4907]: E1129 15:30:57.480982 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:31:09 crc kubenswrapper[4907]: I1129 15:31:09.480522 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:31:09 crc kubenswrapper[4907]: E1129 15:31:09.481329 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:31:23 crc kubenswrapper[4907]: I1129 15:31:23.479473 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:31:23 crc kubenswrapper[4907]: E1129 15:31:23.480380 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:31:36 crc kubenswrapper[4907]: I1129 15:31:36.480876 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:31:36 crc kubenswrapper[4907]: E1129 15:31:36.481997 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:31:48 crc kubenswrapper[4907]: I1129 15:31:48.480445 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:31:48 crc kubenswrapper[4907]: E1129 15:31:48.481317 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:31:59 crc kubenswrapper[4907]: I1129 15:31:59.480432 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:31:59 crc kubenswrapper[4907]: E1129 15:31:59.481847 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:32:13 crc kubenswrapper[4907]: I1129 15:32:13.481464 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:32:13 crc kubenswrapper[4907]: E1129 15:32:13.482492 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:32:24 crc kubenswrapper[4907]: I1129 15:32:24.480180 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:32:24 crc kubenswrapper[4907]: E1129 15:32:24.482012 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:32:38 crc kubenswrapper[4907]: I1129 15:32:38.480061 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:32:38 crc kubenswrapper[4907]: E1129 15:32:38.481071 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:32:51 crc kubenswrapper[4907]: I1129 15:32:51.480405 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:32:51 crc kubenswrapper[4907]: E1129 15:32:51.481259 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:33:03 crc kubenswrapper[4907]: I1129 15:33:03.479173 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:33:03 crc kubenswrapper[4907]: E1129 15:33:03.480034 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:33:15 crc kubenswrapper[4907]: I1129 15:33:15.480122 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:33:15 crc kubenswrapper[4907]: E1129 15:33:15.481155 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:33:27 crc kubenswrapper[4907]: I1129 15:33:27.479636 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:33:27 crc kubenswrapper[4907]: E1129 15:33:27.480456 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:33:38 crc kubenswrapper[4907]: I1129 15:33:38.480363 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:33:38 crc kubenswrapper[4907]: E1129 15:33:38.481331 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:33:50 crc kubenswrapper[4907]: I1129 15:33:50.479240 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:33:50 crc kubenswrapper[4907]: E1129 15:33:50.480098 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:34:01 crc kubenswrapper[4907]: I1129 15:34:01.479600 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:34:01 crc kubenswrapper[4907]: E1129 15:34:01.480390 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:34:13 crc kubenswrapper[4907]: I1129 15:34:13.479647 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:34:13 crc kubenswrapper[4907]: E1129 15:34:13.480708 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:34:24 crc kubenswrapper[4907]: I1129 15:34:24.480065 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:34:24 crc kubenswrapper[4907]: E1129 15:34:24.481242 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd"
Nov 29 15:34:36 crc kubenswrapper[4907]: I1129 15:34:36.480071 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2"
Nov 29 15:34:36 crc kubenswrapper[4907]: I1129 15:34:36.992632 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"3764b4c9482c7833c66ca1145889329f3e46a38783410ffb48f7726fd6484816"}
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.301608 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5qdqh"]
Nov 29 15:36:04 crc kubenswrapper[4907]: E1129 15:36:04.307126 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerName="extract-content"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.307277 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerName="extract-content"
Nov 29 15:36:04 crc kubenswrapper[4907]: E1129 15:36:04.307406 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277653db-feea-4592-9b50-736eea017ff9" containerName="extract-content"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.307547 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="277653db-feea-4592-9b50-736eea017ff9" containerName="extract-content"
Nov 29 15:36:04 crc kubenswrapper[4907]: E1129 15:36:04.307673 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277653db-feea-4592-9b50-736eea017ff9" containerName="extract-utilities"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.307936 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="277653db-feea-4592-9b50-736eea017ff9" containerName="extract-utilities"
Nov 29 15:36:04 crc kubenswrapper[4907]: E1129 15:36:04.308097 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277653db-feea-4592-9b50-736eea017ff9" containerName="registry-server"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.308206 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="277653db-feea-4592-9b50-736eea017ff9" containerName="registry-server"
Nov 29 15:36:04 crc kubenswrapper[4907]: E1129 15:36:04.308327 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b2ed79-7759-486a-bc40-c260b58ae2fa" containerName="collect-profiles"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.308441 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b2ed79-7759-486a-bc40-c260b58ae2fa" containerName="collect-profiles"
Nov 29 15:36:04 crc kubenswrapper[4907]: E1129 15:36:04.308596 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerName="extract-utilities"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.308699 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerName="extract-utilities"
Nov 29 15:36:04 crc kubenswrapper[4907]: E1129 15:36:04.308837 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerName="registry-server"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.308940 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerName="registry-server"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.309536 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="277653db-feea-4592-9b50-736eea017ff9" containerName="registry-server"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.309713 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9be91a6-a6ed-4689-8f97-226e7fd2ce2b" containerName="registry-server"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.309838 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="73b2ed79-7759-486a-bc40-c260b58ae2fa" containerName="collect-profiles"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.313184 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qdqh"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.324667 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qdqh"]
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.343359 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-utilities\") pod \"redhat-operators-5qdqh\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " pod="openshift-marketplace/redhat-operators-5qdqh"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.343769 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vszs4\" (UniqueName: \"kubernetes.io/projected/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-kube-api-access-vszs4\") pod \"redhat-operators-5qdqh\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " pod="openshift-marketplace/redhat-operators-5qdqh"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.343898 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-catalog-content\") pod \"redhat-operators-5qdqh\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " pod="openshift-marketplace/redhat-operators-5qdqh"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.446725 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-utilities\") pod \"redhat-operators-5qdqh\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " pod="openshift-marketplace/redhat-operators-5qdqh"
Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.447223 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-vszs4\" (UniqueName: \"kubernetes.io/projected/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-kube-api-access-vszs4\") pod \"redhat-operators-5qdqh\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.447339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-catalog-content\") pod \"redhat-operators-5qdqh\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.447434 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-utilities\") pod \"redhat-operators-5qdqh\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.447616 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-catalog-content\") pod \"redhat-operators-5qdqh\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.471587 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vszs4\" (UniqueName: \"kubernetes.io/projected/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-kube-api-access-vszs4\") pod \"redhat-operators-5qdqh\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:04 crc kubenswrapper[4907]: I1129 15:36:04.643353 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:05 crc kubenswrapper[4907]: I1129 15:36:05.789628 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qdqh"] Nov 29 15:36:06 crc kubenswrapper[4907]: I1129 15:36:06.209556 4907 generic.go:334] "Generic (PLEG): container finished" podID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerID="09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db" exitCode=0 Nov 29 15:36:06 crc kubenswrapper[4907]: I1129 15:36:06.209899 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qdqh" event={"ID":"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343","Type":"ContainerDied","Data":"09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db"} Nov 29 15:36:06 crc kubenswrapper[4907]: I1129 15:36:06.209937 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qdqh" event={"ID":"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343","Type":"ContainerStarted","Data":"274461ca97c14713e2844af484c181f963f81735737b5fa1f5c12f42d4d9250f"} Nov 29 15:36:06 crc kubenswrapper[4907]: I1129 15:36:06.213343 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 15:36:08 crc kubenswrapper[4907]: I1129 15:36:08.239364 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qdqh" event={"ID":"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343","Type":"ContainerStarted","Data":"dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe"} Nov 29 15:36:11 crc kubenswrapper[4907]: I1129 15:36:11.300598 4907 generic.go:334] "Generic (PLEG): container finished" podID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerID="dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe" exitCode=0 Nov 29 15:36:11 crc kubenswrapper[4907]: I1129 15:36:11.300731 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-5qdqh" event={"ID":"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343","Type":"ContainerDied","Data":"dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe"} Nov 29 15:36:12 crc kubenswrapper[4907]: I1129 15:36:12.317147 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qdqh" event={"ID":"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343","Type":"ContainerStarted","Data":"961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2"} Nov 29 15:36:12 crc kubenswrapper[4907]: I1129 15:36:12.360118 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5qdqh" podStartSLOduration=2.7305563729999998 podStartE2EDuration="8.36007629s" podCreationTimestamp="2025-11-29 15:36:04 +0000 UTC" firstStartedPulling="2025-11-29 15:36:06.212944267 +0000 UTC m=+4064.199781949" lastFinishedPulling="2025-11-29 15:36:11.842464194 +0000 UTC m=+4069.829301866" observedRunningTime="2025-11-29 15:36:12.353965318 +0000 UTC m=+4070.340803000" watchObservedRunningTime="2025-11-29 15:36:12.36007629 +0000 UTC m=+4070.346913952" Nov 29 15:36:14 crc kubenswrapper[4907]: I1129 15:36:14.644286 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:14 crc kubenswrapper[4907]: I1129 15:36:14.644601 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:15 crc kubenswrapper[4907]: I1129 15:36:15.707852 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5qdqh" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerName="registry-server" probeResult="failure" output=< Nov 29 15:36:15 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 15:36:15 crc kubenswrapper[4907]: > Nov 29 15:36:24 crc kubenswrapper[4907]: I1129 
15:36:24.730184 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:24 crc kubenswrapper[4907]: I1129 15:36:24.816691 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:24 crc kubenswrapper[4907]: I1129 15:36:24.983971 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qdqh"] Nov 29 15:36:26 crc kubenswrapper[4907]: I1129 15:36:26.502646 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5qdqh" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerName="registry-server" containerID="cri-o://961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2" gracePeriod=2 Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.107925 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.121776 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-catalog-content\") pod \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.122248 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vszs4\" (UniqueName: \"kubernetes.io/projected/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-kube-api-access-vszs4\") pod \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.122659 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-utilities\") pod \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\" (UID: \"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343\") " Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.123895 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-utilities" (OuterVolumeSpecName: "utilities") pod "5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" (UID: "5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.133520 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-kube-api-access-vszs4" (OuterVolumeSpecName: "kube-api-access-vszs4") pod "5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" (UID: "5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343"). InnerVolumeSpecName "kube-api-access-vszs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.227004 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.227079 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vszs4\" (UniqueName: \"kubernetes.io/projected/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-kube-api-access-vszs4\") on node \"crc\" DevicePath \"\"" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.273366 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" (UID: "5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.329979 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.518631 4907 generic.go:334] "Generic (PLEG): container finished" podID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerID="961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2" exitCode=0 Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.518694 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qdqh" event={"ID":"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343","Type":"ContainerDied","Data":"961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2"} Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.518800 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qdqh" event={"ID":"5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343","Type":"ContainerDied","Data":"274461ca97c14713e2844af484c181f963f81735737b5fa1f5c12f42d4d9250f"} Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.518843 4907 scope.go:117] "RemoveContainer" containerID="961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.518725 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qdqh" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.551631 4907 scope.go:117] "RemoveContainer" containerID="dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.597254 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qdqh"] Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.608874 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5qdqh"] Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.612185 4907 scope.go:117] "RemoveContainer" containerID="09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.665114 4907 scope.go:117] "RemoveContainer" containerID="961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2" Nov 29 15:36:27 crc kubenswrapper[4907]: E1129 15:36:27.667717 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2\": container with ID starting with 961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2 not found: ID does not exist" containerID="961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.667760 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2"} err="failed to get container status \"961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2\": rpc error: code = NotFound desc = could not find container \"961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2\": container with ID starting with 961dd9ec5ad857564bc501715e3932e13467a6a9b2aecb817a6a51c16537fda2 not found: ID does 
not exist" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.667786 4907 scope.go:117] "RemoveContainer" containerID="dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe" Nov 29 15:36:27 crc kubenswrapper[4907]: E1129 15:36:27.668318 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe\": container with ID starting with dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe not found: ID does not exist" containerID="dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.668348 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe"} err="failed to get container status \"dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe\": rpc error: code = NotFound desc = could not find container \"dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe\": container with ID starting with dfab63eb35358ea64ac02414294eb1d4f6de0aaade1115ce941765df3a8c2bfe not found: ID does not exist" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.668367 4907 scope.go:117] "RemoveContainer" containerID="09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db" Nov 29 15:36:27 crc kubenswrapper[4907]: E1129 15:36:27.669285 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db\": container with ID starting with 09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db not found: ID does not exist" containerID="09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db" Nov 29 15:36:27 crc kubenswrapper[4907]: I1129 15:36:27.669313 4907 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db"} err="failed to get container status \"09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db\": rpc error: code = NotFound desc = could not find container \"09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db\": container with ID starting with 09dfecd172cb66085f6add7ae2914d5aecc5760a529dd56adca13fd2af44b2db not found: ID does not exist" Nov 29 15:36:28 crc kubenswrapper[4907]: I1129 15:36:28.495872 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" path="/var/lib/kubelet/pods/5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343/volumes" Nov 29 15:36:58 crc kubenswrapper[4907]: I1129 15:36:58.490381 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:36:58 crc kubenswrapper[4907]: I1129 15:36:58.491100 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:37:28 crc kubenswrapper[4907]: I1129 15:37:28.490023 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:37:28 crc kubenswrapper[4907]: I1129 15:37:28.490551 4907 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:37:58 crc kubenswrapper[4907]: I1129 15:37:58.490205 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:37:58 crc kubenswrapper[4907]: I1129 15:37:58.490770 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:37:58 crc kubenswrapper[4907]: I1129 15:37:58.494852 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:37:58 crc kubenswrapper[4907]: I1129 15:37:58.496015 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3764b4c9482c7833c66ca1145889329f3e46a38783410ffb48f7726fd6484816"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:37:58 crc kubenswrapper[4907]: I1129 15:37:58.496125 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" 
containerID="cri-o://3764b4c9482c7833c66ca1145889329f3e46a38783410ffb48f7726fd6484816" gracePeriod=600 Nov 29 15:37:58 crc kubenswrapper[4907]: I1129 15:37:58.827449 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="3764b4c9482c7833c66ca1145889329f3e46a38783410ffb48f7726fd6484816" exitCode=0 Nov 29 15:37:58 crc kubenswrapper[4907]: I1129 15:37:58.827508 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"3764b4c9482c7833c66ca1145889329f3e46a38783410ffb48f7726fd6484816"} Nov 29 15:37:58 crc kubenswrapper[4907]: I1129 15:37:58.827765 4907 scope.go:117] "RemoveContainer" containerID="01e654261a46e6c953cc249ff57f365bb7ef08a25d3fbd7a3a9faacd3f772ff2" Nov 29 15:37:59 crc kubenswrapper[4907]: I1129 15:37:59.844802 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e"} Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.427325 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kdr9r"] Nov 29 15:39:19 crc kubenswrapper[4907]: E1129 15:39:19.428677 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerName="extract-utilities" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.428694 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerName="extract-utilities" Nov 29 15:39:19 crc kubenswrapper[4907]: E1129 15:39:19.428734 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerName="extract-content" Nov 29 
15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.428742 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerName="extract-content" Nov 29 15:39:19 crc kubenswrapper[4907]: E1129 15:39:19.428765 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerName="registry-server" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.428772 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerName="registry-server" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.429073 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ebc4c2d-9cb8-43dd-9427-7f8cf3d03343" containerName="registry-server" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.431238 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.467828 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdr9r"] Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.474178 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq449\" (UniqueName: \"kubernetes.io/projected/442c332f-badd-4aee-8941-cb0e917aeb78-kube-api-access-bq449\") pod \"community-operators-kdr9r\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.474247 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-catalog-content\") pod \"community-operators-kdr9r\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " pod="openshift-marketplace/community-operators-kdr9r" Nov 
29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.474352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-utilities\") pod \"community-operators-kdr9r\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.576607 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-utilities\") pod \"community-operators-kdr9r\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.576964 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq449\" (UniqueName: \"kubernetes.io/projected/442c332f-badd-4aee-8941-cb0e917aeb78-kube-api-access-bq449\") pod \"community-operators-kdr9r\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.577018 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-catalog-content\") pod \"community-operators-kdr9r\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:19 crc kubenswrapper[4907]: I1129 15:39:19.577231 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-utilities\") pod \"community-operators-kdr9r\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:19 crc 
kubenswrapper[4907]: I1129 15:39:19.578690 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-catalog-content\") pod \"community-operators-kdr9r\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:20 crc kubenswrapper[4907]: I1129 15:39:20.146255 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq449\" (UniqueName: \"kubernetes.io/projected/442c332f-badd-4aee-8941-cb0e917aeb78-kube-api-access-bq449\") pod \"community-operators-kdr9r\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:20 crc kubenswrapper[4907]: I1129 15:39:20.382476 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:20 crc kubenswrapper[4907]: I1129 15:39:20.847935 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdr9r"] Nov 29 15:39:20 crc kubenswrapper[4907]: I1129 15:39:20.901808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdr9r" event={"ID":"442c332f-badd-4aee-8941-cb0e917aeb78","Type":"ContainerStarted","Data":"4102c347eaba6941dc1cfbfe2fcb835da24891bf7a76abfb599285f0f0c11e17"} Nov 29 15:39:21 crc kubenswrapper[4907]: I1129 15:39:21.924694 4907 generic.go:334] "Generic (PLEG): container finished" podID="442c332f-badd-4aee-8941-cb0e917aeb78" containerID="cb31865da17ae14c721cf321259c84f906283da577cd725f3f457d4d77bacfab" exitCode=0 Nov 29 15:39:21 crc kubenswrapper[4907]: I1129 15:39:21.924855 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdr9r" 
event={"ID":"442c332f-badd-4aee-8941-cb0e917aeb78","Type":"ContainerDied","Data":"cb31865da17ae14c721cf321259c84f906283da577cd725f3f457d4d77bacfab"} Nov 29 15:39:23 crc kubenswrapper[4907]: I1129 15:39:23.964754 4907 generic.go:334] "Generic (PLEG): container finished" podID="442c332f-badd-4aee-8941-cb0e917aeb78" containerID="6e3c543cba82f6a2febc666ce3e0de6fcb5e9789a6cfe28eacfaa0a011bc81ae" exitCode=0 Nov 29 15:39:23 crc kubenswrapper[4907]: I1129 15:39:23.964879 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdr9r" event={"ID":"442c332f-badd-4aee-8941-cb0e917aeb78","Type":"ContainerDied","Data":"6e3c543cba82f6a2febc666ce3e0de6fcb5e9789a6cfe28eacfaa0a011bc81ae"} Nov 29 15:39:24 crc kubenswrapper[4907]: I1129 15:39:24.981293 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdr9r" event={"ID":"442c332f-badd-4aee-8941-cb0e917aeb78","Type":"ContainerStarted","Data":"b7e63487c0f0e3cc5b7a6b5000d1be88f9f1be0d2307f8d48843989cfad26b14"} Nov 29 15:39:25 crc kubenswrapper[4907]: I1129 15:39:25.013902 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kdr9r" podStartSLOduration=3.491104057 podStartE2EDuration="6.01386187s" podCreationTimestamp="2025-11-29 15:39:19 +0000 UTC" firstStartedPulling="2025-11-29 15:39:21.928126352 +0000 UTC m=+4259.914964044" lastFinishedPulling="2025-11-29 15:39:24.450884165 +0000 UTC m=+4262.437721857" observedRunningTime="2025-11-29 15:39:25.005419572 +0000 UTC m=+4262.992257254" watchObservedRunningTime="2025-11-29 15:39:25.01386187 +0000 UTC m=+4263.000699562" Nov 29 15:39:30 crc kubenswrapper[4907]: I1129 15:39:30.383852 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:30 crc kubenswrapper[4907]: I1129 15:39:30.384312 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:30 crc kubenswrapper[4907]: I1129 15:39:30.437984 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:31 crc kubenswrapper[4907]: I1129 15:39:31.123610 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:31 crc kubenswrapper[4907]: I1129 15:39:31.190154 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdr9r"] Nov 29 15:39:33 crc kubenswrapper[4907]: I1129 15:39:33.069593 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kdr9r" podUID="442c332f-badd-4aee-8941-cb0e917aeb78" containerName="registry-server" containerID="cri-o://b7e63487c0f0e3cc5b7a6b5000d1be88f9f1be0d2307f8d48843989cfad26b14" gracePeriod=2 Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.085806 4907 generic.go:334] "Generic (PLEG): container finished" podID="442c332f-badd-4aee-8941-cb0e917aeb78" containerID="b7e63487c0f0e3cc5b7a6b5000d1be88f9f1be0d2307f8d48843989cfad26b14" exitCode=0 Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.085914 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdr9r" event={"ID":"442c332f-badd-4aee-8941-cb0e917aeb78","Type":"ContainerDied","Data":"b7e63487c0f0e3cc5b7a6b5000d1be88f9f1be0d2307f8d48843989cfad26b14"} Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.354580 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.506206 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq449\" (UniqueName: \"kubernetes.io/projected/442c332f-badd-4aee-8941-cb0e917aeb78-kube-api-access-bq449\") pod \"442c332f-badd-4aee-8941-cb0e917aeb78\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.506243 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-utilities\") pod \"442c332f-badd-4aee-8941-cb0e917aeb78\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.506483 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-catalog-content\") pod \"442c332f-badd-4aee-8941-cb0e917aeb78\" (UID: \"442c332f-badd-4aee-8941-cb0e917aeb78\") " Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.534308 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442c332f-badd-4aee-8941-cb0e917aeb78-kube-api-access-bq449" (OuterVolumeSpecName: "kube-api-access-bq449") pod "442c332f-badd-4aee-8941-cb0e917aeb78" (UID: "442c332f-badd-4aee-8941-cb0e917aeb78"). InnerVolumeSpecName "kube-api-access-bq449". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.546843 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-utilities" (OuterVolumeSpecName: "utilities") pod "442c332f-badd-4aee-8941-cb0e917aeb78" (UID: "442c332f-badd-4aee-8941-cb0e917aeb78"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.596950 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "442c332f-badd-4aee-8941-cb0e917aeb78" (UID: "442c332f-badd-4aee-8941-cb0e917aeb78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.609076 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq449\" (UniqueName: \"kubernetes.io/projected/442c332f-badd-4aee-8941-cb0e917aeb78-kube-api-access-bq449\") on node \"crc\" DevicePath \"\"" Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.609118 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:39:34 crc kubenswrapper[4907]: I1129 15:39:34.609127 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/442c332f-badd-4aee-8941-cb0e917aeb78-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:39:35 crc kubenswrapper[4907]: I1129 15:39:35.102184 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdr9r" event={"ID":"442c332f-badd-4aee-8941-cb0e917aeb78","Type":"ContainerDied","Data":"4102c347eaba6941dc1cfbfe2fcb835da24891bf7a76abfb599285f0f0c11e17"} Nov 29 15:39:35 crc kubenswrapper[4907]: I1129 15:39:35.102251 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdr9r" Nov 29 15:39:35 crc kubenswrapper[4907]: I1129 15:39:35.102641 4907 scope.go:117] "RemoveContainer" containerID="b7e63487c0f0e3cc5b7a6b5000d1be88f9f1be0d2307f8d48843989cfad26b14" Nov 29 15:39:35 crc kubenswrapper[4907]: I1129 15:39:35.140370 4907 scope.go:117] "RemoveContainer" containerID="6e3c543cba82f6a2febc666ce3e0de6fcb5e9789a6cfe28eacfaa0a011bc81ae" Nov 29 15:39:35 crc kubenswrapper[4907]: I1129 15:39:35.167264 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdr9r"] Nov 29 15:39:35 crc kubenswrapper[4907]: I1129 15:39:35.180910 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kdr9r"] Nov 29 15:39:35 crc kubenswrapper[4907]: I1129 15:39:35.184156 4907 scope.go:117] "RemoveContainer" containerID="cb31865da17ae14c721cf321259c84f906283da577cd725f3f457d4d77bacfab" Nov 29 15:39:36 crc kubenswrapper[4907]: I1129 15:39:36.499628 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="442c332f-badd-4aee-8941-cb0e917aeb78" path="/var/lib/kubelet/pods/442c332f-badd-4aee-8941-cb0e917aeb78/volumes" Nov 29 15:39:45 crc kubenswrapper[4907]: E1129 15:39:45.547872 4907 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.47:57838->38.102.83.47:43783: write tcp 38.102.83.47:57838->38.102.83.47:43783: write: broken pipe Nov 29 15:39:58 crc kubenswrapper[4907]: I1129 15:39:58.489841 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:39:58 crc kubenswrapper[4907]: I1129 15:39:58.491388 4907 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:40:11 crc kubenswrapper[4907]: I1129 15:40:11.903050 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rjvhz"] Nov 29 15:40:11 crc kubenswrapper[4907]: E1129 15:40:11.904503 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442c332f-badd-4aee-8941-cb0e917aeb78" containerName="extract-utilities" Nov 29 15:40:11 crc kubenswrapper[4907]: I1129 15:40:11.904525 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="442c332f-badd-4aee-8941-cb0e917aeb78" containerName="extract-utilities" Nov 29 15:40:11 crc kubenswrapper[4907]: E1129 15:40:11.904551 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442c332f-badd-4aee-8941-cb0e917aeb78" containerName="registry-server" Nov 29 15:40:11 crc kubenswrapper[4907]: I1129 15:40:11.904567 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="442c332f-badd-4aee-8941-cb0e917aeb78" containerName="registry-server" Nov 29 15:40:11 crc kubenswrapper[4907]: E1129 15:40:11.904629 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442c332f-badd-4aee-8941-cb0e917aeb78" containerName="extract-content" Nov 29 15:40:11 crc kubenswrapper[4907]: I1129 15:40:11.904642 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="442c332f-badd-4aee-8941-cb0e917aeb78" containerName="extract-content" Nov 29 15:40:11 crc kubenswrapper[4907]: I1129 15:40:11.905045 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="442c332f-badd-4aee-8941-cb0e917aeb78" containerName="registry-server" Nov 29 15:40:11 crc kubenswrapper[4907]: I1129 15:40:11.908022 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:11 crc kubenswrapper[4907]: I1129 15:40:11.921306 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjvhz"] Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.057971 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4lvd\" (UniqueName: \"kubernetes.io/projected/02bc3496-71eb-4937-860e-c6f3b03a8c41-kube-api-access-k4lvd\") pod \"certified-operators-rjvhz\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.058038 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-utilities\") pod \"certified-operators-rjvhz\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.058270 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-catalog-content\") pod \"certified-operators-rjvhz\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.160666 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-catalog-content\") pod \"certified-operators-rjvhz\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.160888 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k4lvd\" (UniqueName: \"kubernetes.io/projected/02bc3496-71eb-4937-860e-c6f3b03a8c41-kube-api-access-k4lvd\") pod \"certified-operators-rjvhz\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.160935 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-utilities\") pod \"certified-operators-rjvhz\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.161329 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-utilities\") pod \"certified-operators-rjvhz\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.161480 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-catalog-content\") pod \"certified-operators-rjvhz\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.640277 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4lvd\" (UniqueName: \"kubernetes.io/projected/02bc3496-71eb-4937-860e-c6f3b03a8c41-kube-api-access-k4lvd\") pod \"certified-operators-rjvhz\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:12 crc kubenswrapper[4907]: I1129 15:40:12.844894 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:13 crc kubenswrapper[4907]: I1129 15:40:13.371655 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjvhz"] Nov 29 15:40:13 crc kubenswrapper[4907]: I1129 15:40:13.712638 4907 generic.go:334] "Generic (PLEG): container finished" podID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerID="fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea" exitCode=0 Nov 29 15:40:13 crc kubenswrapper[4907]: I1129 15:40:13.712726 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjvhz" event={"ID":"02bc3496-71eb-4937-860e-c6f3b03a8c41","Type":"ContainerDied","Data":"fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea"} Nov 29 15:40:13 crc kubenswrapper[4907]: I1129 15:40:13.713097 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjvhz" event={"ID":"02bc3496-71eb-4937-860e-c6f3b03a8c41","Type":"ContainerStarted","Data":"2097113c8e97f65a0b2bd925d9f8b45f6d9800415645d3a927b0084e479dd7b7"} Nov 29 15:40:14 crc kubenswrapper[4907]: I1129 15:40:14.729187 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjvhz" event={"ID":"02bc3496-71eb-4937-860e-c6f3b03a8c41","Type":"ContainerStarted","Data":"710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e"} Nov 29 15:40:15 crc kubenswrapper[4907]: I1129 15:40:15.752492 4907 generic.go:334] "Generic (PLEG): container finished" podID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerID="710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e" exitCode=0 Nov 29 15:40:15 crc kubenswrapper[4907]: I1129 15:40:15.752978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjvhz" 
event={"ID":"02bc3496-71eb-4937-860e-c6f3b03a8c41","Type":"ContainerDied","Data":"710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e"} Nov 29 15:40:16 crc kubenswrapper[4907]: I1129 15:40:16.771874 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjvhz" event={"ID":"02bc3496-71eb-4937-860e-c6f3b03a8c41","Type":"ContainerStarted","Data":"b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f"} Nov 29 15:40:16 crc kubenswrapper[4907]: I1129 15:40:16.798814 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rjvhz" podStartSLOduration=3.10843884 podStartE2EDuration="5.798796535s" podCreationTimestamp="2025-11-29 15:40:11 +0000 UTC" firstStartedPulling="2025-11-29 15:40:13.715478016 +0000 UTC m=+4311.702315708" lastFinishedPulling="2025-11-29 15:40:16.405835721 +0000 UTC m=+4314.392673403" observedRunningTime="2025-11-29 15:40:16.796752647 +0000 UTC m=+4314.783590309" watchObservedRunningTime="2025-11-29 15:40:16.798796535 +0000 UTC m=+4314.785634187" Nov 29 15:40:22 crc kubenswrapper[4907]: I1129 15:40:22.845423 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:22 crc kubenswrapper[4907]: I1129 15:40:22.846103 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:22 crc kubenswrapper[4907]: I1129 15:40:22.915956 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:24 crc kubenswrapper[4907]: I1129 15:40:24.429745 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:24 crc kubenswrapper[4907]: I1129 15:40:24.528658 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-rjvhz"] Nov 29 15:40:25 crc kubenswrapper[4907]: I1129 15:40:25.872390 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rjvhz" podUID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerName="registry-server" containerID="cri-o://b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f" gracePeriod=2 Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.443327 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.513270 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-catalog-content\") pod \"02bc3496-71eb-4937-860e-c6f3b03a8c41\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.513429 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4lvd\" (UniqueName: \"kubernetes.io/projected/02bc3496-71eb-4937-860e-c6f3b03a8c41-kube-api-access-k4lvd\") pod \"02bc3496-71eb-4937-860e-c6f3b03a8c41\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.513554 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-utilities\") pod \"02bc3496-71eb-4937-860e-c6f3b03a8c41\" (UID: \"02bc3496-71eb-4937-860e-c6f3b03a8c41\") " Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.518041 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-utilities" (OuterVolumeSpecName: "utilities") pod "02bc3496-71eb-4937-860e-c6f3b03a8c41" (UID: 
"02bc3496-71eb-4937-860e-c6f3b03a8c41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.527713 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bc3496-71eb-4937-860e-c6f3b03a8c41-kube-api-access-k4lvd" (OuterVolumeSpecName: "kube-api-access-k4lvd") pod "02bc3496-71eb-4937-860e-c6f3b03a8c41" (UID: "02bc3496-71eb-4937-860e-c6f3b03a8c41"). InnerVolumeSpecName "kube-api-access-k4lvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.559862 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02bc3496-71eb-4937-860e-c6f3b03a8c41" (UID: "02bc3496-71eb-4937-860e-c6f3b03a8c41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.616772 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.616812 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bc3496-71eb-4937-860e-c6f3b03a8c41-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.616827 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4lvd\" (UniqueName: \"kubernetes.io/projected/02bc3496-71eb-4937-860e-c6f3b03a8c41-kube-api-access-k4lvd\") on node \"crc\" DevicePath \"\"" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.886023 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerID="b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f" exitCode=0 Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.886103 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjvhz" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.886231 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjvhz" event={"ID":"02bc3496-71eb-4937-860e-c6f3b03a8c41","Type":"ContainerDied","Data":"b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f"} Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.887898 4907 scope.go:117] "RemoveContainer" containerID="b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.887776 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjvhz" event={"ID":"02bc3496-71eb-4937-860e-c6f3b03a8c41","Type":"ContainerDied","Data":"2097113c8e97f65a0b2bd925d9f8b45f6d9800415645d3a927b0084e479dd7b7"} Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.909860 4907 scope.go:117] "RemoveContainer" containerID="710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.923664 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjvhz"] Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.935770 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rjvhz"] Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.953498 4907 scope.go:117] "RemoveContainer" containerID="fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.980610 4907 scope.go:117] "RemoveContainer" 
containerID="b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f" Nov 29 15:40:26 crc kubenswrapper[4907]: E1129 15:40:26.980977 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f\": container with ID starting with b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f not found: ID does not exist" containerID="b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.981013 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f"} err="failed to get container status \"b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f\": rpc error: code = NotFound desc = could not find container \"b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f\": container with ID starting with b981a4cd9c33a86d67d0d4419afaf711c75940d60fe6a1f95c01583a7a69da9f not found: ID does not exist" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.981034 4907 scope.go:117] "RemoveContainer" containerID="710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e" Nov 29 15:40:26 crc kubenswrapper[4907]: E1129 15:40:26.981701 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e\": container with ID starting with 710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e not found: ID does not exist" containerID="710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.981726 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e"} err="failed to get container status \"710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e\": rpc error: code = NotFound desc = could not find container \"710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e\": container with ID starting with 710f31602be33d7f4403f3e6ac11dd63e413a6b3150b428f2e838a0b762e313e not found: ID does not exist" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.981740 4907 scope.go:117] "RemoveContainer" containerID="fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea" Nov 29 15:40:26 crc kubenswrapper[4907]: E1129 15:40:26.981934 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea\": container with ID starting with fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea not found: ID does not exist" containerID="fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea" Nov 29 15:40:26 crc kubenswrapper[4907]: I1129 15:40:26.981951 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea"} err="failed to get container status \"fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea\": rpc error: code = NotFound desc = could not find container \"fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea\": container with ID starting with fbf62cf6968e18648a80f3a9fa8389ad10a464bdd61bd4f5c9b0b91ce9b651ea not found: ID does not exist" Nov 29 15:40:28 crc kubenswrapper[4907]: I1129 15:40:28.490090 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:40:28 crc kubenswrapper[4907]: I1129 15:40:28.490411 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:40:28 crc kubenswrapper[4907]: I1129 15:40:28.492559 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bc3496-71eb-4937-860e-c6f3b03a8c41" path="/var/lib/kubelet/pods/02bc3496-71eb-4937-860e-c6f3b03a8c41/volumes" Nov 29 15:40:58 crc kubenswrapper[4907]: I1129 15:40:58.489835 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:40:58 crc kubenswrapper[4907]: I1129 15:40:58.490550 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:40:58 crc kubenswrapper[4907]: I1129 15:40:58.492783 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:40:58 crc kubenswrapper[4907]: I1129 15:40:58.493822 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:40:58 crc kubenswrapper[4907]: I1129 15:40:58.493890 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" gracePeriod=600 Nov 29 15:40:59 crc kubenswrapper[4907]: I1129 15:40:59.314609 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" exitCode=0 Nov 29 15:40:59 crc kubenswrapper[4907]: I1129 15:40:59.314679 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e"} Nov 29 15:40:59 crc kubenswrapper[4907]: I1129 15:40:59.314873 4907 scope.go:117] "RemoveContainer" containerID="3764b4c9482c7833c66ca1145889329f3e46a38783410ffb48f7726fd6484816" Nov 29 15:40:59 crc kubenswrapper[4907]: E1129 15:40:59.562554 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:41:00 crc kubenswrapper[4907]: I1129 15:41:00.337279 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:41:00 crc kubenswrapper[4907]: E1129 15:41:00.338307 4907 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:41:13 crc kubenswrapper[4907]: I1129 15:41:13.480950 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:41:13 crc kubenswrapper[4907]: E1129 15:41:13.483714 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.567280 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bnhk4"] Nov 29 15:41:19 crc kubenswrapper[4907]: E1129 15:41:19.568259 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerName="extract-utilities" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.568273 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerName="extract-utilities" Nov 29 15:41:19 crc kubenswrapper[4907]: E1129 15:41:19.568287 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerName="extract-content" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.568293 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerName="extract-content" Nov 29 15:41:19 crc kubenswrapper[4907]: E1129 15:41:19.568334 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerName="registry-server" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.568341 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerName="registry-server" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.569241 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="02bc3496-71eb-4937-860e-c6f3b03a8c41" containerName="registry-server" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.571020 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.602995 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnhk4"] Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.714031 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-utilities\") pod \"redhat-marketplace-bnhk4\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.714291 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-catalog-content\") pod \"redhat-marketplace-bnhk4\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.714350 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8jlk\" (UniqueName: \"kubernetes.io/projected/3ed43524-2a49-4944-8be0-ec6ba10d8b01-kube-api-access-g8jlk\") pod \"redhat-marketplace-bnhk4\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.815929 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-utilities\") pod \"redhat-marketplace-bnhk4\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.816092 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-catalog-content\") pod \"redhat-marketplace-bnhk4\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.816130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8jlk\" (UniqueName: \"kubernetes.io/projected/3ed43524-2a49-4944-8be0-ec6ba10d8b01-kube-api-access-g8jlk\") pod \"redhat-marketplace-bnhk4\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.816408 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-utilities\") pod \"redhat-marketplace-bnhk4\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.816684 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-catalog-content\") pod \"redhat-marketplace-bnhk4\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.836889 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8jlk\" (UniqueName: \"kubernetes.io/projected/3ed43524-2a49-4944-8be0-ec6ba10d8b01-kube-api-access-g8jlk\") pod \"redhat-marketplace-bnhk4\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:19 crc kubenswrapper[4907]: I1129 15:41:19.927352 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:20 crc kubenswrapper[4907]: I1129 15:41:20.434076 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnhk4"] Nov 29 15:41:20 crc kubenswrapper[4907]: I1129 15:41:20.637725 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnhk4" event={"ID":"3ed43524-2a49-4944-8be0-ec6ba10d8b01","Type":"ContainerStarted","Data":"947d5daa725ba42d490d00dd6bdc9b089b1ce7bc4984cd269070e32efbeec6e4"} Nov 29 15:41:21 crc kubenswrapper[4907]: I1129 15:41:21.656554 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerID="476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2" exitCode=0 Nov 29 15:41:21 crc kubenswrapper[4907]: I1129 15:41:21.656638 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnhk4" event={"ID":"3ed43524-2a49-4944-8be0-ec6ba10d8b01","Type":"ContainerDied","Data":"476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2"} Nov 29 15:41:21 crc kubenswrapper[4907]: I1129 
15:41:21.660212 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 15:41:23 crc kubenswrapper[4907]: I1129 15:41:23.689026 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerID="7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6" exitCode=0 Nov 29 15:41:23 crc kubenswrapper[4907]: I1129 15:41:23.689140 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnhk4" event={"ID":"3ed43524-2a49-4944-8be0-ec6ba10d8b01","Type":"ContainerDied","Data":"7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6"} Nov 29 15:41:24 crc kubenswrapper[4907]: I1129 15:41:24.706208 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnhk4" event={"ID":"3ed43524-2a49-4944-8be0-ec6ba10d8b01","Type":"ContainerStarted","Data":"9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7"} Nov 29 15:41:24 crc kubenswrapper[4907]: I1129 15:41:24.739121 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bnhk4" podStartSLOduration=3.270816074 podStartE2EDuration="5.73909927s" podCreationTimestamp="2025-11-29 15:41:19 +0000 UTC" firstStartedPulling="2025-11-29 15:41:21.659805015 +0000 UTC m=+4379.646642677" lastFinishedPulling="2025-11-29 15:41:24.128088181 +0000 UTC m=+4382.114925873" observedRunningTime="2025-11-29 15:41:24.72705788 +0000 UTC m=+4382.713895552" watchObservedRunningTime="2025-11-29 15:41:24.73909927 +0000 UTC m=+4382.725936932" Nov 29 15:41:27 crc kubenswrapper[4907]: I1129 15:41:27.480895 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:41:27 crc kubenswrapper[4907]: E1129 15:41:27.481548 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:41:29 crc kubenswrapper[4907]: I1129 15:41:29.927534 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:29 crc kubenswrapper[4907]: I1129 15:41:29.928186 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:30 crc kubenswrapper[4907]: I1129 15:41:30.031001 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:30 crc kubenswrapper[4907]: I1129 15:41:30.890090 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:30 crc kubenswrapper[4907]: I1129 15:41:30.957824 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnhk4"] Nov 29 15:41:32 crc kubenswrapper[4907]: I1129 15:41:32.825429 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bnhk4" podUID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerName="registry-server" containerID="cri-o://9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7" gracePeriod=2 Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.516191 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.715865 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-catalog-content\") pod \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.716531 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-utilities\") pod \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.716758 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8jlk\" (UniqueName: \"kubernetes.io/projected/3ed43524-2a49-4944-8be0-ec6ba10d8b01-kube-api-access-g8jlk\") pod \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\" (UID: \"3ed43524-2a49-4944-8be0-ec6ba10d8b01\") " Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.721910 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-utilities" (OuterVolumeSpecName: "utilities") pod "3ed43524-2a49-4944-8be0-ec6ba10d8b01" (UID: "3ed43524-2a49-4944-8be0-ec6ba10d8b01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.729307 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed43524-2a49-4944-8be0-ec6ba10d8b01-kube-api-access-g8jlk" (OuterVolumeSpecName: "kube-api-access-g8jlk") pod "3ed43524-2a49-4944-8be0-ec6ba10d8b01" (UID: "3ed43524-2a49-4944-8be0-ec6ba10d8b01"). InnerVolumeSpecName "kube-api-access-g8jlk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.755256 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ed43524-2a49-4944-8be0-ec6ba10d8b01" (UID: "3ed43524-2a49-4944-8be0-ec6ba10d8b01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.819459 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8jlk\" (UniqueName: \"kubernetes.io/projected/3ed43524-2a49-4944-8be0-ec6ba10d8b01-kube-api-access-g8jlk\") on node \"crc\" DevicePath \"\"" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.819496 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.819506 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed43524-2a49-4944-8be0-ec6ba10d8b01-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.843938 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerID="9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7" exitCode=0 Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.843985 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnhk4" event={"ID":"3ed43524-2a49-4944-8be0-ec6ba10d8b01","Type":"ContainerDied","Data":"9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7"} Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.844016 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bnhk4" event={"ID":"3ed43524-2a49-4944-8be0-ec6ba10d8b01","Type":"ContainerDied","Data":"947d5daa725ba42d490d00dd6bdc9b089b1ce7bc4984cd269070e32efbeec6e4"} Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.844035 4907 scope.go:117] "RemoveContainer" containerID="9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.844203 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnhk4" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.892783 4907 scope.go:117] "RemoveContainer" containerID="7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6" Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.898680 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnhk4"] Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.915144 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnhk4"] Nov 29 15:41:33 crc kubenswrapper[4907]: I1129 15:41:33.932298 4907 scope.go:117] "RemoveContainer" containerID="476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2" Nov 29 15:41:34 crc kubenswrapper[4907]: I1129 15:41:34.008801 4907 scope.go:117] "RemoveContainer" containerID="9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7" Nov 29 15:41:34 crc kubenswrapper[4907]: E1129 15:41:34.009249 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7\": container with ID starting with 9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7 not found: ID does not exist" containerID="9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7" Nov 29 15:41:34 crc kubenswrapper[4907]: I1129 15:41:34.009281 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7"} err="failed to get container status \"9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7\": rpc error: code = NotFound desc = could not find container \"9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7\": container with ID starting with 9e32d0206774e9f987f541c9b36577c388801cd8324975cba34a08313c8849f7 not found: ID does not exist" Nov 29 15:41:34 crc kubenswrapper[4907]: I1129 15:41:34.009301 4907 scope.go:117] "RemoveContainer" containerID="7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6" Nov 29 15:41:34 crc kubenswrapper[4907]: E1129 15:41:34.010502 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6\": container with ID starting with 7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6 not found: ID does not exist" containerID="7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6" Nov 29 15:41:34 crc kubenswrapper[4907]: I1129 15:41:34.010550 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6"} err="failed to get container status \"7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6\": rpc error: code = NotFound desc = could not find container \"7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6\": container with ID starting with 7102d39d3aaf52751b701f2e1dab8d18db114400b010057918ec88da1e642bf6 not found: ID does not exist" Nov 29 15:41:34 crc kubenswrapper[4907]: I1129 15:41:34.010578 4907 scope.go:117] "RemoveContainer" containerID="476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2" Nov 29 15:41:34 crc kubenswrapper[4907]: E1129 
15:41:34.011131 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2\": container with ID starting with 476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2 not found: ID does not exist" containerID="476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2" Nov 29 15:41:34 crc kubenswrapper[4907]: I1129 15:41:34.011158 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2"} err="failed to get container status \"476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2\": rpc error: code = NotFound desc = could not find container \"476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2\": container with ID starting with 476a081d2ea6dd5229b590e49ad8971ef6fa4971bffa20fc79547106a578c4f2 not found: ID does not exist" Nov 29 15:41:34 crc kubenswrapper[4907]: I1129 15:41:34.502190 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" path="/var/lib/kubelet/pods/3ed43524-2a49-4944-8be0-ec6ba10d8b01/volumes" Nov 29 15:41:39 crc kubenswrapper[4907]: I1129 15:41:39.482183 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:41:39 crc kubenswrapper[4907]: E1129 15:41:39.483183 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:41:52 crc kubenswrapper[4907]: I1129 15:41:52.492185 
4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:41:52 crc kubenswrapper[4907]: E1129 15:41:52.493214 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:42:07 crc kubenswrapper[4907]: I1129 15:42:07.481505 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:42:07 crc kubenswrapper[4907]: E1129 15:42:07.483195 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:42:15 crc kubenswrapper[4907]: I1129 15:42:15.753941 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="2b4712d7-a81b-455f-841a-a0ca14eafcbe" containerName="galera" probeResult="failure" output="command timed out" Nov 29 15:42:15 crc kubenswrapper[4907]: I1129 15:42:15.771670 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="2b4712d7-a81b-455f-841a-a0ca14eafcbe" containerName="galera" probeResult="failure" output="command timed out" Nov 29 15:42:21 crc kubenswrapper[4907]: I1129 15:42:21.480212 4907 scope.go:117] "RemoveContainer" 
containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:42:21 crc kubenswrapper[4907]: E1129 15:42:21.481019 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:42:36 crc kubenswrapper[4907]: I1129 15:42:36.481034 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:42:36 crc kubenswrapper[4907]: E1129 15:42:36.482642 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:42:51 crc kubenswrapper[4907]: I1129 15:42:51.480861 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:42:51 crc kubenswrapper[4907]: E1129 15:42:51.482279 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:43:03 crc kubenswrapper[4907]: I1129 15:43:03.482149 4907 scope.go:117] 
"RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:43:03 crc kubenswrapper[4907]: E1129 15:43:03.483341 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:43:16 crc kubenswrapper[4907]: I1129 15:43:16.488158 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:43:16 crc kubenswrapper[4907]: E1129 15:43:16.489113 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:43:30 crc kubenswrapper[4907]: I1129 15:43:30.484786 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:43:30 crc kubenswrapper[4907]: E1129 15:43:30.487108 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:43:43 crc kubenswrapper[4907]: I1129 15:43:43.479789 
4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:43:43 crc kubenswrapper[4907]: E1129 15:43:43.480631 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:43:54 crc kubenswrapper[4907]: I1129 15:43:54.479952 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:43:54 crc kubenswrapper[4907]: E1129 15:43:54.480800 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:44:09 crc kubenswrapper[4907]: I1129 15:44:09.479742 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:44:09 crc kubenswrapper[4907]: E1129 15:44:09.480547 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:44:23 crc kubenswrapper[4907]: I1129 
15:44:23.480608 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:44:23 crc kubenswrapper[4907]: E1129 15:44:23.481581 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:44:37 crc kubenswrapper[4907]: I1129 15:44:37.480261 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:44:37 crc kubenswrapper[4907]: E1129 15:44:37.481074 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:44:48 crc kubenswrapper[4907]: I1129 15:44:48.480247 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:44:48 crc kubenswrapper[4907]: E1129 15:44:48.481327 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:44:59 crc 
kubenswrapper[4907]: I1129 15:44:59.479396 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:44:59 crc kubenswrapper[4907]: E1129 15:44:59.480208 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.205193 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq"] Nov 29 15:45:00 crc kubenswrapper[4907]: E1129 15:45:00.205845 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerName="extract-utilities" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.205873 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerName="extract-utilities" Nov 29 15:45:00 crc kubenswrapper[4907]: E1129 15:45:00.205886 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerName="registry-server" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.205894 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerName="registry-server" Nov 29 15:45:00 crc kubenswrapper[4907]: E1129 15:45:00.205911 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerName="extract-content" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.205919 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerName="extract-content" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.206191 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed43524-2a49-4944-8be0-ec6ba10d8b01" containerName="registry-server" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.207201 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.210090 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.213220 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.227004 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq"] Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.239761 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-config-volume\") pod \"collect-profiles-29407185-pqkwq\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.239921 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8bsr\" (UniqueName: \"kubernetes.io/projected/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-kube-api-access-x8bsr\") pod \"collect-profiles-29407185-pqkwq\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 
15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.240421 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-secret-volume\") pod \"collect-profiles-29407185-pqkwq\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.342738 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-secret-volume\") pod \"collect-profiles-29407185-pqkwq\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.343183 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-config-volume\") pod \"collect-profiles-29407185-pqkwq\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.343238 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8bsr\" (UniqueName: \"kubernetes.io/projected/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-kube-api-access-x8bsr\") pod \"collect-profiles-29407185-pqkwq\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.343980 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-config-volume\") pod \"collect-profiles-29407185-pqkwq\" (UID: 
\"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.350797 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-secret-volume\") pod \"collect-profiles-29407185-pqkwq\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.375180 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8bsr\" (UniqueName: \"kubernetes.io/projected/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-kube-api-access-x8bsr\") pod \"collect-profiles-29407185-pqkwq\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:00 crc kubenswrapper[4907]: I1129 15:45:00.539642 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:01 crc kubenswrapper[4907]: I1129 15:45:01.011006 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq"] Nov 29 15:45:01 crc kubenswrapper[4907]: I1129 15:45:01.724362 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" event={"ID":"2cd4cc48-ce7d-4b2f-b065-11835d5834eb","Type":"ContainerStarted","Data":"ce504aa38088da8a1bef3326bbe65f6cbcf4528246ae2a6d7532cab504cb9381"} Nov 29 15:45:01 crc kubenswrapper[4907]: I1129 15:45:01.724737 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" event={"ID":"2cd4cc48-ce7d-4b2f-b065-11835d5834eb","Type":"ContainerStarted","Data":"b26ffff5f62ea8639f8110b54bb4fa0675b3a494988fd2cdb0a320cc842bdd42"} Nov 29 15:45:01 crc kubenswrapper[4907]: I1129 15:45:01.761662 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" podStartSLOduration=1.761643541 podStartE2EDuration="1.761643541s" podCreationTimestamp="2025-11-29 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 15:45:01.750567319 +0000 UTC m=+4599.737405001" watchObservedRunningTime="2025-11-29 15:45:01.761643541 +0000 UTC m=+4599.748481193" Nov 29 15:45:02 crc kubenswrapper[4907]: I1129 15:45:02.746860 4907 generic.go:334] "Generic (PLEG): container finished" podID="2cd4cc48-ce7d-4b2f-b065-11835d5834eb" containerID="ce504aa38088da8a1bef3326bbe65f6cbcf4528246ae2a6d7532cab504cb9381" exitCode=0 Nov 29 15:45:02 crc kubenswrapper[4907]: I1129 15:45:02.746947 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" event={"ID":"2cd4cc48-ce7d-4b2f-b065-11835d5834eb","Type":"ContainerDied","Data":"ce504aa38088da8a1bef3326bbe65f6cbcf4528246ae2a6d7532cab504cb9381"} Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.226923 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.337773 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-config-volume\") pod \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.337927 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-secret-volume\") pod \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.337965 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8bsr\" (UniqueName: \"kubernetes.io/projected/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-kube-api-access-x8bsr\") pod \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\" (UID: \"2cd4cc48-ce7d-4b2f-b065-11835d5834eb\") " Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.339811 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-config-volume" (OuterVolumeSpecName: "config-volume") pod "2cd4cc48-ce7d-4b2f-b065-11835d5834eb" (UID: "2cd4cc48-ce7d-4b2f-b065-11835d5834eb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.345723 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2cd4cc48-ce7d-4b2f-b065-11835d5834eb" (UID: "2cd4cc48-ce7d-4b2f-b065-11835d5834eb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.356162 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-kube-api-access-x8bsr" (OuterVolumeSpecName: "kube-api-access-x8bsr") pod "2cd4cc48-ce7d-4b2f-b065-11835d5834eb" (UID: "2cd4cc48-ce7d-4b2f-b065-11835d5834eb"). InnerVolumeSpecName "kube-api-access-x8bsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.440579 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.440623 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.440638 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8bsr\" (UniqueName: \"kubernetes.io/projected/2cd4cc48-ce7d-4b2f-b065-11835d5834eb-kube-api-access-x8bsr\") on node \"crc\" DevicePath \"\"" Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.768892 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" 
event={"ID":"2cd4cc48-ce7d-4b2f-b065-11835d5834eb","Type":"ContainerDied","Data":"b26ffff5f62ea8639f8110b54bb4fa0675b3a494988fd2cdb0a320cc842bdd42"} Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.769408 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b26ffff5f62ea8639f8110b54bb4fa0675b3a494988fd2cdb0a320cc842bdd42" Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.768952 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407185-pqkwq" Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.862532 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m"] Nov 29 15:45:04 crc kubenswrapper[4907]: I1129 15:45:04.874385 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407140-2gq7m"] Nov 29 15:45:06 crc kubenswrapper[4907]: I1129 15:45:06.492847 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba0e8a79-8f7d-4441-85e7-616d33386673" path="/var/lib/kubelet/pods/ba0e8a79-8f7d-4441-85e7-616d33386673/volumes" Nov 29 15:45:13 crc kubenswrapper[4907]: I1129 15:45:13.479547 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:45:13 crc kubenswrapper[4907]: E1129 15:45:13.480475 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:45:28 crc kubenswrapper[4907]: I1129 15:45:28.483423 4907 scope.go:117] "RemoveContainer" 
containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:45:28 crc kubenswrapper[4907]: E1129 15:45:28.484323 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:45:30 crc kubenswrapper[4907]: I1129 15:45:30.621038 4907 scope.go:117] "RemoveContainer" containerID="77cb45b089dac74176bde7b90f34c9f927c06ea36d48eeac2820c23bcd119795" Nov 29 15:45:41 crc kubenswrapper[4907]: I1129 15:45:41.480518 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:45:41 crc kubenswrapper[4907]: E1129 15:45:41.481354 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:45:53 crc kubenswrapper[4907]: I1129 15:45:53.480017 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:45:53 crc kubenswrapper[4907]: E1129 15:45:53.483821 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:46:04 crc kubenswrapper[4907]: I1129 15:46:04.480339 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:46:05 crc kubenswrapper[4907]: I1129 15:46:05.572990 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"6e24dcbb8c9e438c67ce623147f1bd4b3d9b6b2886bc5d2a86b53d4411cf22f8"} Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.437333 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 29 15:48:00 crc kubenswrapper[4907]: E1129 15:48:00.438782 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd4cc48-ce7d-4b2f-b065-11835d5834eb" containerName="collect-profiles" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.438808 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd4cc48-ce7d-4b2f-b065-11835d5834eb" containerName="collect-profiles" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.439260 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd4cc48-ce7d-4b2f-b065-11835d5834eb" containerName="collect-profiles" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.440661 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.447104 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.447521 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gwjbm" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.447693 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.447843 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.456838 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.601827 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.601939 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-config-data\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.601984 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.602050 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92s7r\" (UniqueName: \"kubernetes.io/projected/e689bb5a-7b28-48c6-995f-bc0dc07078de-kube-api-access-92s7r\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.602082 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.602170 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.602222 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.602252 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.602305 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.704598 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.704720 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-config-data\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.704769 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.704843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92s7r\" (UniqueName: 
\"kubernetes.io/projected/e689bb5a-7b28-48c6-995f-bc0dc07078de-kube-api-access-92s7r\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.704898 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.704939 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.705343 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.705358 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.705568 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.705633 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.705728 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.706276 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.706272 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.706968 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-config-data\") pod 
\"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.713133 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.713946 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.723688 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.728133 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92s7r\" (UniqueName: \"kubernetes.io/projected/e689bb5a-7b28-48c6-995f-bc0dc07078de-kube-api-access-92s7r\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.758273 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " pod="openstack/tempest-tests-tempest" Nov 29 15:48:00 crc kubenswrapper[4907]: I1129 15:48:00.780226 4907 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 29 15:48:01 crc kubenswrapper[4907]: I1129 15:48:01.319373 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 29 15:48:01 crc kubenswrapper[4907]: I1129 15:48:01.334847 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 15:48:02 crc kubenswrapper[4907]: I1129 15:48:02.224594 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e689bb5a-7b28-48c6-995f-bc0dc07078de","Type":"ContainerStarted","Data":"b548c1ce3c7dbe990310154a9407db9e1796867b63905ac16605c2ec24aaf538"} Nov 29 15:48:28 crc kubenswrapper[4907]: I1129 15:48:28.489668 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:48:28 crc kubenswrapper[4907]: I1129 15:48:28.490148 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:48:40 crc kubenswrapper[4907]: E1129 15:48:40.947252 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Nov 29 15:48:40 crc kubenswrapper[4907]: E1129 15:48:40.949545 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92s7r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(e689bb5a-7b28-48c6-995f-bc0dc07078de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 15:48:40 crc kubenswrapper[4907]: E1129 15:48:40.950843 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="e689bb5a-7b28-48c6-995f-bc0dc07078de" Nov 29 15:48:41 crc kubenswrapper[4907]: E1129 15:48:41.745927 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="e689bb5a-7b28-48c6-995f-bc0dc07078de" Nov 29 15:48:55 crc 
kubenswrapper[4907]: I1129 15:48:55.256012 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 29 15:48:56 crc kubenswrapper[4907]: I1129 15:48:56.969044 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e689bb5a-7b28-48c6-995f-bc0dc07078de","Type":"ContainerStarted","Data":"da80da5f18935b57f85bdec8a5c210ee17e62bfc05d60865d2ef53d76541deef"} Nov 29 15:48:56 crc kubenswrapper[4907]: I1129 15:48:56.998222 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.079945585 podStartE2EDuration="57.998196387s" podCreationTimestamp="2025-11-29 15:47:59 +0000 UTC" firstStartedPulling="2025-11-29 15:48:01.334638632 +0000 UTC m=+4779.321476274" lastFinishedPulling="2025-11-29 15:48:55.252889404 +0000 UTC m=+4833.239727076" observedRunningTime="2025-11-29 15:48:56.991847618 +0000 UTC m=+4834.978685280" watchObservedRunningTime="2025-11-29 15:48:56.998196387 +0000 UTC m=+4834.985034049" Nov 29 15:48:58 crc kubenswrapper[4907]: I1129 15:48:58.490214 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:48:58 crc kubenswrapper[4907]: I1129 15:48:58.490521 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:49:28 crc kubenswrapper[4907]: I1129 15:49:28.489882 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:49:28 crc kubenswrapper[4907]: I1129 15:49:28.490494 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:49:28 crc kubenswrapper[4907]: I1129 15:49:28.493262 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:49:28 crc kubenswrapper[4907]: I1129 15:49:28.494224 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e24dcbb8c9e438c67ce623147f1bd4b3d9b6b2886bc5d2a86b53d4411cf22f8"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:49:28 crc kubenswrapper[4907]: I1129 15:49:28.494296 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://6e24dcbb8c9e438c67ce623147f1bd4b3d9b6b2886bc5d2a86b53d4411cf22f8" gracePeriod=600 Nov 29 15:49:29 crc kubenswrapper[4907]: I1129 15:49:29.367814 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="6e24dcbb8c9e438c67ce623147f1bd4b3d9b6b2886bc5d2a86b53d4411cf22f8" exitCode=0 Nov 29 15:49:29 crc kubenswrapper[4907]: I1129 15:49:29.367904 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"6e24dcbb8c9e438c67ce623147f1bd4b3d9b6b2886bc5d2a86b53d4411cf22f8"} Nov 29 15:49:29 crc kubenswrapper[4907]: I1129 15:49:29.368473 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317"} Nov 29 15:49:29 crc kubenswrapper[4907]: I1129 15:49:29.368505 4907 scope.go:117] "RemoveContainer" containerID="989a4abf7e0526be8e833fda0f83ec419f1a636524f68b19ba5c897c7468125e" Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.741602 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rxvjq"] Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.745112 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.765832 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxvjq"] Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.827332 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2b263a-b49b-4b7e-bcd7-f17b707db54a-utilities\") pod \"community-operators-rxvjq\" (UID: \"aa2b263a-b49b-4b7e-bcd7-f17b707db54a\") " pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.827565 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2b263a-b49b-4b7e-bcd7-f17b707db54a-catalog-content\") pod \"community-operators-rxvjq\" (UID: \"aa2b263a-b49b-4b7e-bcd7-f17b707db54a\") " pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.827610 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhtwd\" (UniqueName: \"kubernetes.io/projected/aa2b263a-b49b-4b7e-bcd7-f17b707db54a-kube-api-access-hhtwd\") pod \"community-operators-rxvjq\" (UID: \"aa2b263a-b49b-4b7e-bcd7-f17b707db54a\") " pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.929872 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2b263a-b49b-4b7e-bcd7-f17b707db54a-catalog-content\") pod \"community-operators-rxvjq\" (UID: \"aa2b263a-b49b-4b7e-bcd7-f17b707db54a\") " pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.929944 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hhtwd\" (UniqueName: \"kubernetes.io/projected/aa2b263a-b49b-4b7e-bcd7-f17b707db54a-kube-api-access-hhtwd\") pod \"community-operators-rxvjq\" (UID: \"aa2b263a-b49b-4b7e-bcd7-f17b707db54a\") " pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.930100 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2b263a-b49b-4b7e-bcd7-f17b707db54a-utilities\") pod \"community-operators-rxvjq\" (UID: \"aa2b263a-b49b-4b7e-bcd7-f17b707db54a\") " pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.932087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2b263a-b49b-4b7e-bcd7-f17b707db54a-utilities\") pod \"community-operators-rxvjq\" (UID: \"aa2b263a-b49b-4b7e-bcd7-f17b707db54a\") " pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:18 crc kubenswrapper[4907]: I1129 15:50:18.933090 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2b263a-b49b-4b7e-bcd7-f17b707db54a-catalog-content\") pod \"community-operators-rxvjq\" (UID: \"aa2b263a-b49b-4b7e-bcd7-f17b707db54a\") " pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:19 crc kubenswrapper[4907]: I1129 15:50:19.244527 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhtwd\" (UniqueName: \"kubernetes.io/projected/aa2b263a-b49b-4b7e-bcd7-f17b707db54a-kube-api-access-hhtwd\") pod \"community-operators-rxvjq\" (UID: \"aa2b263a-b49b-4b7e-bcd7-f17b707db54a\") " pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:19 crc kubenswrapper[4907]: I1129 15:50:19.367548 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:20 crc kubenswrapper[4907]: I1129 15:50:20.089024 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxvjq"] Nov 29 15:50:20 crc kubenswrapper[4907]: I1129 15:50:20.981781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxvjq" event={"ID":"aa2b263a-b49b-4b7e-bcd7-f17b707db54a","Type":"ContainerDied","Data":"610c50ae30c8cbdde5564d57876d09b0f4623f015b5445fbb026d9878a28033f"} Nov 29 15:50:20 crc kubenswrapper[4907]: I1129 15:50:20.981860 4907 generic.go:334] "Generic (PLEG): container finished" podID="aa2b263a-b49b-4b7e-bcd7-f17b707db54a" containerID="610c50ae30c8cbdde5564d57876d09b0f4623f015b5445fbb026d9878a28033f" exitCode=0 Nov 29 15:50:20 crc kubenswrapper[4907]: I1129 15:50:20.982152 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxvjq" event={"ID":"aa2b263a-b49b-4b7e-bcd7-f17b707db54a","Type":"ContainerStarted","Data":"e976e13ee63064f67cb3d12bfdd500b77944f4f0bcc368041452284e63653a3b"} Nov 29 15:50:28 crc kubenswrapper[4907]: I1129 15:50:28.052585 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxvjq" event={"ID":"aa2b263a-b49b-4b7e-bcd7-f17b707db54a","Type":"ContainerStarted","Data":"e4b92c3f9f94d0916bf337a6baee455d097c77a5f2286b85131c5bc2cccfe784"} Nov 29 15:50:29 crc kubenswrapper[4907]: I1129 15:50:29.066124 4907 generic.go:334] "Generic (PLEG): container finished" podID="aa2b263a-b49b-4b7e-bcd7-f17b707db54a" containerID="e4b92c3f9f94d0916bf337a6baee455d097c77a5f2286b85131c5bc2cccfe784" exitCode=0 Nov 29 15:50:29 crc kubenswrapper[4907]: I1129 15:50:29.066303 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxvjq" 
event={"ID":"aa2b263a-b49b-4b7e-bcd7-f17b707db54a","Type":"ContainerDied","Data":"e4b92c3f9f94d0916bf337a6baee455d097c77a5f2286b85131c5bc2cccfe784"} Nov 29 15:50:30 crc kubenswrapper[4907]: I1129 15:50:30.079094 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxvjq" event={"ID":"aa2b263a-b49b-4b7e-bcd7-f17b707db54a","Type":"ContainerStarted","Data":"01a602e61f158c54d8bc484f71ef7b8b8c705368b98ea176938c3edf83d40f0d"} Nov 29 15:50:30 crc kubenswrapper[4907]: I1129 15:50:30.093926 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rxvjq" podStartSLOduration=3.538062387 podStartE2EDuration="12.093365454s" podCreationTimestamp="2025-11-29 15:50:18 +0000 UTC" firstStartedPulling="2025-11-29 15:50:20.983966182 +0000 UTC m=+4918.970803834" lastFinishedPulling="2025-11-29 15:50:29.539269209 +0000 UTC m=+4927.526106901" observedRunningTime="2025-11-29 15:50:30.091923223 +0000 UTC m=+4928.078760875" watchObservedRunningTime="2025-11-29 15:50:30.093365454 +0000 UTC m=+4928.080203106" Nov 29 15:50:39 crc kubenswrapper[4907]: I1129 15:50:39.368913 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:39 crc kubenswrapper[4907]: I1129 15:50:39.369414 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:39 crc kubenswrapper[4907]: I1129 15:50:39.437143 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:40 crc kubenswrapper[4907]: I1129 15:50:40.684343 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rxvjq" Nov 29 15:50:40 crc kubenswrapper[4907]: I1129 15:50:40.752071 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-rxvjq"] Nov 29 15:50:40 crc kubenswrapper[4907]: I1129 15:50:40.800187 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kmjqr"] Nov 29 15:50:40 crc kubenswrapper[4907]: I1129 15:50:40.803320 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kmjqr" podUID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerName="registry-server" containerID="cri-o://a4d3f3bd14cb5f398459a14cdd1eac0eacac6c7530f585639e598d97bf144e43" gracePeriod=2 Nov 29 15:50:41 crc kubenswrapper[4907]: I1129 15:50:41.204070 4907 generic.go:334] "Generic (PLEG): container finished" podID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerID="a4d3f3bd14cb5f398459a14cdd1eac0eacac6c7530f585639e598d97bf144e43" exitCode=0 Nov 29 15:50:41 crc kubenswrapper[4907]: I1129 15:50:41.204165 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kmjqr" event={"ID":"0e75e676-b47f-4633-9011-93f0bbc72b01","Type":"ContainerDied","Data":"a4d3f3bd14cb5f398459a14cdd1eac0eacac6c7530f585639e598d97bf144e43"} Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.095057 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.221363 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kmjqr" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.221879 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kmjqr" event={"ID":"0e75e676-b47f-4633-9011-93f0bbc72b01","Type":"ContainerDied","Data":"2647b1fa7ba38971979282ad56e1c44a6db24b39751a33b46b361a45a38b203c"} Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.296380 4907 scope.go:117] "RemoveContainer" containerID="a4d3f3bd14cb5f398459a14cdd1eac0eacac6c7530f585639e598d97bf144e43" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.296534 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gkdc\" (UniqueName: \"kubernetes.io/projected/0e75e676-b47f-4633-9011-93f0bbc72b01-kube-api-access-2gkdc\") pod \"0e75e676-b47f-4633-9011-93f0bbc72b01\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.296833 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-utilities\") pod \"0e75e676-b47f-4633-9011-93f0bbc72b01\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.296971 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-catalog-content\") pod \"0e75e676-b47f-4633-9011-93f0bbc72b01\" (UID: \"0e75e676-b47f-4633-9011-93f0bbc72b01\") " Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.300095 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-utilities" (OuterVolumeSpecName: "utilities") pod "0e75e676-b47f-4633-9011-93f0bbc72b01" (UID: "0e75e676-b47f-4633-9011-93f0bbc72b01"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.316921 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e75e676-b47f-4633-9011-93f0bbc72b01-kube-api-access-2gkdc" (OuterVolumeSpecName: "kube-api-access-2gkdc") pod "0e75e676-b47f-4633-9011-93f0bbc72b01" (UID: "0e75e676-b47f-4633-9011-93f0bbc72b01"). InnerVolumeSpecName "kube-api-access-2gkdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.500897 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gkdc\" (UniqueName: \"kubernetes.io/projected/0e75e676-b47f-4633-9011-93f0bbc72b01-kube-api-access-2gkdc\") on node \"crc\" DevicePath \"\"" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.501508 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.527286 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e75e676-b47f-4633-9011-93f0bbc72b01" (UID: "0e75e676-b47f-4633-9011-93f0bbc72b01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.541553 4907 scope.go:117] "RemoveContainer" containerID="2a55120f778052f1c244d0ce6c0f2368a6b6f51f3fb03deb8bd60c0e8bafc39b" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.578191 4907 scope.go:117] "RemoveContainer" containerID="6f85752a38cce831f2feb7ff151dc5ce32ed1f521d60ea7fc50b9badb7607a73" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.604965 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e75e676-b47f-4633-9011-93f0bbc72b01-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.871754 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kmjqr"] Nov 29 15:50:42 crc kubenswrapper[4907]: I1129 15:50:42.872042 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kmjqr"] Nov 29 15:50:44 crc kubenswrapper[4907]: I1129 15:50:44.495952 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e75e676-b47f-4633-9011-93f0bbc72b01" path="/var/lib/kubelet/pods/0e75e676-b47f-4633-9011-93f0bbc72b01/volumes" Nov 29 15:51:28 crc kubenswrapper[4907]: I1129 15:51:28.491004 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:51:28 crc kubenswrapper[4907]: I1129 15:51:28.492103 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Nov 29 15:51:58 crc kubenswrapper[4907]: I1129 15:51:58.490764 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:51:58 crc kubenswrapper[4907]: I1129 15:51:58.492111 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.283016 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bdnqp"] Nov 29 15:52:00 crc kubenswrapper[4907]: E1129 15:52:00.287024 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerName="registry-server" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.287294 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerName="registry-server" Nov 29 15:52:00 crc kubenswrapper[4907]: E1129 15:52:00.287659 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerName="extract-utilities" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.287685 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerName="extract-utilities" Nov 29 15:52:00 crc kubenswrapper[4907]: E1129 15:52:00.287767 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerName="extract-content" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 
15:52:00.287782 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerName="extract-content" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.289094 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e75e676-b47f-4633-9011-93f0bbc72b01" containerName="registry-server" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.293166 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.417858 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-utilities\") pod \"redhat-marketplace-bdnqp\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.418039 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kb56\" (UniqueName: \"kubernetes.io/projected/bf3d105a-80f3-42a4-b242-6dd915b824c2-kube-api-access-2kb56\") pod \"redhat-marketplace-bdnqp\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.418157 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-catalog-content\") pod \"redhat-marketplace-bdnqp\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.493336 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdnqp"] Nov 29 15:52:00 crc kubenswrapper[4907]: 
I1129 15:52:00.520331 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-utilities\") pod \"redhat-marketplace-bdnqp\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.520749 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kb56\" (UniqueName: \"kubernetes.io/projected/bf3d105a-80f3-42a4-b242-6dd915b824c2-kube-api-access-2kb56\") pod \"redhat-marketplace-bdnqp\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.521170 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-catalog-content\") pod \"redhat-marketplace-bdnqp\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.523869 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-utilities\") pod \"redhat-marketplace-bdnqp\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.524188 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-catalog-content\") pod \"redhat-marketplace-bdnqp\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.639465 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kb56\" (UniqueName: \"kubernetes.io/projected/bf3d105a-80f3-42a4-b242-6dd915b824c2-kube-api-access-2kb56\") pod \"redhat-marketplace-bdnqp\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:00 crc kubenswrapper[4907]: I1129 15:52:00.645754 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:01 crc kubenswrapper[4907]: I1129 15:52:01.444643 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdnqp"] Nov 29 15:52:01 crc kubenswrapper[4907]: I1129 15:52:01.882451 4907 generic.go:334] "Generic (PLEG): container finished" podID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerID="b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6" exitCode=0 Nov 29 15:52:01 crc kubenswrapper[4907]: I1129 15:52:01.882584 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdnqp" event={"ID":"bf3d105a-80f3-42a4-b242-6dd915b824c2","Type":"ContainerDied","Data":"b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6"} Nov 29 15:52:01 crc kubenswrapper[4907]: I1129 15:52:01.883082 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdnqp" event={"ID":"bf3d105a-80f3-42a4-b242-6dd915b824c2","Type":"ContainerStarted","Data":"4e382060851f23430c98677515a0a19df851d588be4e3a84eed40a162ef1d3db"} Nov 29 15:52:03 crc kubenswrapper[4907]: I1129 15:52:03.971983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdnqp" event={"ID":"bf3d105a-80f3-42a4-b242-6dd915b824c2","Type":"ContainerStarted","Data":"1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f"} Nov 29 15:52:04 crc kubenswrapper[4907]: I1129 15:52:04.985607 4907 
generic.go:334] "Generic (PLEG): container finished" podID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerID="1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f" exitCode=0 Nov 29 15:52:04 crc kubenswrapper[4907]: I1129 15:52:04.985923 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdnqp" event={"ID":"bf3d105a-80f3-42a4-b242-6dd915b824c2","Type":"ContainerDied","Data":"1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f"} Nov 29 15:52:06 crc kubenswrapper[4907]: I1129 15:52:06.001888 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdnqp" event={"ID":"bf3d105a-80f3-42a4-b242-6dd915b824c2","Type":"ContainerStarted","Data":"c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b"} Nov 29 15:52:06 crc kubenswrapper[4907]: I1129 15:52:06.032251 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bdnqp" podStartSLOduration=2.401194755 podStartE2EDuration="6.031463687s" podCreationTimestamp="2025-11-29 15:52:00 +0000 UTC" firstStartedPulling="2025-11-29 15:52:01.886276746 +0000 UTC m=+5019.873114398" lastFinishedPulling="2025-11-29 15:52:05.516545678 +0000 UTC m=+5023.503383330" observedRunningTime="2025-11-29 15:52:06.029470301 +0000 UTC m=+5024.016307963" watchObservedRunningTime="2025-11-29 15:52:06.031463687 +0000 UTC m=+5024.018301349" Nov 29 15:52:10 crc kubenswrapper[4907]: I1129 15:52:10.646671 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:10 crc kubenswrapper[4907]: I1129 15:52:10.648524 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:10 crc kubenswrapper[4907]: I1129 15:52:10.732356 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:11 crc kubenswrapper[4907]: I1129 15:52:11.118217 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:11 crc kubenswrapper[4907]: I1129 15:52:11.189553 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdnqp"] Nov 29 15:52:13 crc kubenswrapper[4907]: I1129 15:52:13.079280 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bdnqp" podUID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerName="registry-server" containerID="cri-o://c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b" gracePeriod=2 Nov 29 15:52:13 crc kubenswrapper[4907]: I1129 15:52:13.798919 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:13 crc kubenswrapper[4907]: I1129 15:52:13.909077 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-catalog-content\") pod \"bf3d105a-80f3-42a4-b242-6dd915b824c2\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " Nov 29 15:52:13 crc kubenswrapper[4907]: I1129 15:52:13.909342 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kb56\" (UniqueName: \"kubernetes.io/projected/bf3d105a-80f3-42a4-b242-6dd915b824c2-kube-api-access-2kb56\") pod \"bf3d105a-80f3-42a4-b242-6dd915b824c2\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " Nov 29 15:52:13 crc kubenswrapper[4907]: I1129 15:52:13.909510 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-utilities\") pod 
\"bf3d105a-80f3-42a4-b242-6dd915b824c2\" (UID: \"bf3d105a-80f3-42a4-b242-6dd915b824c2\") " Nov 29 15:52:13 crc kubenswrapper[4907]: I1129 15:52:13.912029 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-utilities" (OuterVolumeSpecName: "utilities") pod "bf3d105a-80f3-42a4-b242-6dd915b824c2" (UID: "bf3d105a-80f3-42a4-b242-6dd915b824c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:52:13 crc kubenswrapper[4907]: I1129 15:52:13.924244 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3d105a-80f3-42a4-b242-6dd915b824c2-kube-api-access-2kb56" (OuterVolumeSpecName: "kube-api-access-2kb56") pod "bf3d105a-80f3-42a4-b242-6dd915b824c2" (UID: "bf3d105a-80f3-42a4-b242-6dd915b824c2"). InnerVolumeSpecName "kube-api-access-2kb56". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:52:13 crc kubenswrapper[4907]: I1129 15:52:13.932594 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf3d105a-80f3-42a4-b242-6dd915b824c2" (UID: "bf3d105a-80f3-42a4-b242-6dd915b824c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.011504 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.011720 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kb56\" (UniqueName: \"kubernetes.io/projected/bf3d105a-80f3-42a4-b242-6dd915b824c2-kube-api-access-2kb56\") on node \"crc\" DevicePath \"\"" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.011800 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d105a-80f3-42a4-b242-6dd915b824c2-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.092536 4907 generic.go:334] "Generic (PLEG): container finished" podID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerID="c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b" exitCode=0 Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.092581 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdnqp" event={"ID":"bf3d105a-80f3-42a4-b242-6dd915b824c2","Type":"ContainerDied","Data":"c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b"} Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.092610 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdnqp" event={"ID":"bf3d105a-80f3-42a4-b242-6dd915b824c2","Type":"ContainerDied","Data":"4e382060851f23430c98677515a0a19df851d588be4e3a84eed40a162ef1d3db"} Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.092632 4907 scope.go:117] "RemoveContainer" containerID="c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 
15:52:14.093488 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdnqp" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.121423 4907 scope.go:117] "RemoveContainer" containerID="1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.147613 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdnqp"] Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.161660 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdnqp"] Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.172024 4907 scope.go:117] "RemoveContainer" containerID="b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.226419 4907 scope.go:117] "RemoveContainer" containerID="c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b" Nov 29 15:52:14 crc kubenswrapper[4907]: E1129 15:52:14.229735 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b\": container with ID starting with c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b not found: ID does not exist" containerID="c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.229993 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b"} err="failed to get container status \"c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b\": rpc error: code = NotFound desc = could not find container \"c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b\": container with ID starting with 
c643de716f4f95a5f7d720a087cb2e72a3156192375d6c11169ce6ea2444aa3b not found: ID does not exist" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.230031 4907 scope.go:117] "RemoveContainer" containerID="1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f" Nov 29 15:52:14 crc kubenswrapper[4907]: E1129 15:52:14.230310 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f\": container with ID starting with 1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f not found: ID does not exist" containerID="1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.230341 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f"} err="failed to get container status \"1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f\": rpc error: code = NotFound desc = could not find container \"1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f\": container with ID starting with 1dc4a2b9f769d1529dfb00764e394b1a25939afe4be735ba3009cfdb8417d23f not found: ID does not exist" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.230370 4907 scope.go:117] "RemoveContainer" containerID="b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6" Nov 29 15:52:14 crc kubenswrapper[4907]: E1129 15:52:14.231402 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6\": container with ID starting with b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6 not found: ID does not exist" containerID="b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6" Nov 29 15:52:14 crc 
kubenswrapper[4907]: I1129 15:52:14.231460 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6"} err="failed to get container status \"b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6\": rpc error: code = NotFound desc = could not find container \"b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6\": container with ID starting with b14d342452b248d4ca769978da82c5e7800a7ce39cb458ebbf7bc6bf414dd6b6 not found: ID does not exist" Nov 29 15:52:14 crc kubenswrapper[4907]: I1129 15:52:14.494223 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3d105a-80f3-42a4-b242-6dd915b824c2" path="/var/lib/kubelet/pods/bf3d105a-80f3-42a4-b242-6dd915b824c2/volumes" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.069092 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cczl6"] Nov 29 15:52:22 crc kubenswrapper[4907]: E1129 15:52:22.070223 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerName="extract-content" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.070238 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerName="extract-content" Nov 29 15:52:22 crc kubenswrapper[4907]: E1129 15:52:22.070258 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerName="extract-utilities" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.070264 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerName="extract-utilities" Nov 29 15:52:22 crc kubenswrapper[4907]: E1129 15:52:22.070309 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerName="registry-server" Nov 29 
15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.070315 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerName="registry-server" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.070580 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3d105a-80f3-42a4-b242-6dd915b824c2" containerName="registry-server" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.074498 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.118715 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cczl6"] Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.241792 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmhfv\" (UniqueName: \"kubernetes.io/projected/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-kube-api-access-xmhfv\") pod \"redhat-operators-cczl6\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.241871 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-catalog-content\") pod \"redhat-operators-cczl6\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.241894 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-utilities\") pod \"redhat-operators-cczl6\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc 
kubenswrapper[4907]: I1129 15:52:22.344195 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmhfv\" (UniqueName: \"kubernetes.io/projected/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-kube-api-access-xmhfv\") pod \"redhat-operators-cczl6\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.344290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-catalog-content\") pod \"redhat-operators-cczl6\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.344322 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-utilities\") pod \"redhat-operators-cczl6\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.346065 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-utilities\") pod \"redhat-operators-cczl6\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.346080 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-catalog-content\") pod \"redhat-operators-cczl6\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.396309 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmhfv\" (UniqueName: \"kubernetes.io/projected/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-kube-api-access-xmhfv\") pod \"redhat-operators-cczl6\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:22 crc kubenswrapper[4907]: I1129 15:52:22.440308 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:23 crc kubenswrapper[4907]: I1129 15:52:23.016539 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cczl6"] Nov 29 15:52:23 crc kubenswrapper[4907]: I1129 15:52:23.231250 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczl6" event={"ID":"d37c8b99-e8b9-476d-aee2-240e1eaba0c8","Type":"ContainerStarted","Data":"7aa093e620495b77fa570a9555e2fbb21b8d15b630ffd42cd96567e387ab3ef3"} Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.250111 4907 generic.go:334] "Generic (PLEG): container finished" podID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerID="2d779d9dd34c1522424c6df9c352029d7f0c2acc02590c647c823760444643e9" exitCode=0 Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.250163 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczl6" event={"ID":"d37c8b99-e8b9-476d-aee2-240e1eaba0c8","Type":"ContainerDied","Data":"2d779d9dd34c1522424c6df9c352029d7f0c2acc02590c647c823760444643e9"} Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.270005 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-shr9l"] Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.272742 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.284536 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shr9l"] Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.395837 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-utilities\") pod \"certified-operators-shr9l\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.395935 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-catalog-content\") pod \"certified-operators-shr9l\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.396282 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjj8q\" (UniqueName: \"kubernetes.io/projected/1bb975d1-ad5c-4850-be1d-266cc281dacd-kube-api-access-cjj8q\") pod \"certified-operators-shr9l\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.498802 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjj8q\" (UniqueName: \"kubernetes.io/projected/1bb975d1-ad5c-4850-be1d-266cc281dacd-kube-api-access-cjj8q\") pod \"certified-operators-shr9l\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.499002 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-utilities\") pod \"certified-operators-shr9l\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.499048 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-catalog-content\") pod \"certified-operators-shr9l\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.500769 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-catalog-content\") pod \"certified-operators-shr9l\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.500919 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-utilities\") pod \"certified-operators-shr9l\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.533923 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjj8q\" (UniqueName: \"kubernetes.io/projected/1bb975d1-ad5c-4850-be1d-266cc281dacd-kube-api-access-cjj8q\") pod \"certified-operators-shr9l\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:24 crc kubenswrapper[4907]: I1129 15:52:24.603539 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:25 crc kubenswrapper[4907]: I1129 15:52:25.193043 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shr9l"] Nov 29 15:52:25 crc kubenswrapper[4907]: I1129 15:52:25.259733 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shr9l" event={"ID":"1bb975d1-ad5c-4850-be1d-266cc281dacd","Type":"ContainerStarted","Data":"161fbb050d18cea1d332e89aa80b660864eb0dcde696de7a575c93d2ec23a740"} Nov 29 15:52:26 crc kubenswrapper[4907]: I1129 15:52:26.277337 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczl6" event={"ID":"d37c8b99-e8b9-476d-aee2-240e1eaba0c8","Type":"ContainerStarted","Data":"2e1c7b662a643f97ed93d380395712aad505d6bedb1e2683837b6dcb40626ddd"} Nov 29 15:52:26 crc kubenswrapper[4907]: I1129 15:52:26.289048 4907 generic.go:334] "Generic (PLEG): container finished" podID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerID="3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f" exitCode=0 Nov 29 15:52:26 crc kubenswrapper[4907]: I1129 15:52:26.289100 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shr9l" event={"ID":"1bb975d1-ad5c-4850-be1d-266cc281dacd","Type":"ContainerDied","Data":"3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f"} Nov 29 15:52:28 crc kubenswrapper[4907]: I1129 15:52:28.309311 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shr9l" event={"ID":"1bb975d1-ad5c-4850-be1d-266cc281dacd","Type":"ContainerStarted","Data":"4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9"} Nov 29 15:52:28 crc kubenswrapper[4907]: I1129 15:52:28.602305 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:52:28 crc kubenswrapper[4907]: I1129 15:52:28.603029 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 15:52:28 crc kubenswrapper[4907]: I1129 15:52:28.603112 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 15:52:28 crc kubenswrapper[4907]: I1129 15:52:28.604376 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 15:52:28 crc kubenswrapper[4907]: I1129 15:52:28.604516 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" gracePeriod=600 Nov 29 15:52:28 crc kubenswrapper[4907]: E1129 15:52:28.861523 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:52:29 crc kubenswrapper[4907]: I1129 15:52:29.361299 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" exitCode=0 Nov 29 15:52:29 crc kubenswrapper[4907]: I1129 15:52:29.361416 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317"} Nov 29 15:52:29 crc kubenswrapper[4907]: I1129 15:52:29.361801 4907 scope.go:117] "RemoveContainer" containerID="6e24dcbb8c9e438c67ce623147f1bd4b3d9b6b2886bc5d2a86b53d4411cf22f8" Nov 29 15:52:29 crc kubenswrapper[4907]: I1129 15:52:29.362431 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:52:29 crc kubenswrapper[4907]: E1129 15:52:29.362898 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:52:31 crc kubenswrapper[4907]: I1129 15:52:31.389995 4907 generic.go:334] "Generic (PLEG): container finished" podID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerID="2e1c7b662a643f97ed93d380395712aad505d6bedb1e2683837b6dcb40626ddd" exitCode=0 Nov 29 15:52:31 crc kubenswrapper[4907]: I1129 15:52:31.390121 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczl6" 
event={"ID":"d37c8b99-e8b9-476d-aee2-240e1eaba0c8","Type":"ContainerDied","Data":"2e1c7b662a643f97ed93d380395712aad505d6bedb1e2683837b6dcb40626ddd"} Nov 29 15:52:31 crc kubenswrapper[4907]: I1129 15:52:31.392727 4907 generic.go:334] "Generic (PLEG): container finished" podID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerID="4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9" exitCode=0 Nov 29 15:52:31 crc kubenswrapper[4907]: I1129 15:52:31.392770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shr9l" event={"ID":"1bb975d1-ad5c-4850-be1d-266cc281dacd","Type":"ContainerDied","Data":"4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9"} Nov 29 15:52:32 crc kubenswrapper[4907]: I1129 15:52:32.403346 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczl6" event={"ID":"d37c8b99-e8b9-476d-aee2-240e1eaba0c8","Type":"ContainerStarted","Data":"953b77943f8b813703d5e2b81e96a143c5be20759a1b18f37fa98fd35baee504"} Nov 29 15:52:32 crc kubenswrapper[4907]: I1129 15:52:32.409140 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shr9l" event={"ID":"1bb975d1-ad5c-4850-be1d-266cc281dacd","Type":"ContainerStarted","Data":"062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc"} Nov 29 15:52:32 crc kubenswrapper[4907]: I1129 15:52:32.440412 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cczl6" podStartSLOduration=2.627855344 podStartE2EDuration="10.440393363s" podCreationTimestamp="2025-11-29 15:52:22 +0000 UTC" firstStartedPulling="2025-11-29 15:52:24.254758057 +0000 UTC m=+5042.241595709" lastFinishedPulling="2025-11-29 15:52:32.067296076 +0000 UTC m=+5050.054133728" observedRunningTime="2025-11-29 15:52:32.44029157 +0000 UTC m=+5050.427129232" watchObservedRunningTime="2025-11-29 15:52:32.440393363 +0000 UTC 
m=+5050.427231025" Nov 29 15:52:32 crc kubenswrapper[4907]: I1129 15:52:32.440708 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:32 crc kubenswrapper[4907]: I1129 15:52:32.440750 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:32 crc kubenswrapper[4907]: I1129 15:52:32.471137 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-shr9l" podStartSLOduration=2.9067066329999998 podStartE2EDuration="8.4711161s" podCreationTimestamp="2025-11-29 15:52:24 +0000 UTC" firstStartedPulling="2025-11-29 15:52:26.293068009 +0000 UTC m=+5044.279905661" lastFinishedPulling="2025-11-29 15:52:31.857477476 +0000 UTC m=+5049.844315128" observedRunningTime="2025-11-29 15:52:32.464626817 +0000 UTC m=+5050.451464469" watchObservedRunningTime="2025-11-29 15:52:32.4711161 +0000 UTC m=+5050.457953752" Nov 29 15:52:33 crc kubenswrapper[4907]: I1129 15:52:33.504727 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cczl6" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="registry-server" probeResult="failure" output=< Nov 29 15:52:33 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 15:52:33 crc kubenswrapper[4907]: > Nov 29 15:52:34 crc kubenswrapper[4907]: I1129 15:52:34.604517 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:34 crc kubenswrapper[4907]: I1129 15:52:34.604808 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:35 crc kubenswrapper[4907]: I1129 15:52:35.667359 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-shr9l" 
podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerName="registry-server" probeResult="failure" output=< Nov 29 15:52:35 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 15:52:35 crc kubenswrapper[4907]: > Nov 29 15:52:40 crc kubenswrapper[4907]: I1129 15:52:40.479829 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:52:40 crc kubenswrapper[4907]: E1129 15:52:40.480482 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:52:43 crc kubenswrapper[4907]: I1129 15:52:43.499260 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cczl6" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="registry-server" probeResult="failure" output=< Nov 29 15:52:43 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 15:52:43 crc kubenswrapper[4907]: > Nov 29 15:52:44 crc kubenswrapper[4907]: I1129 15:52:44.687043 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:44 crc kubenswrapper[4907]: I1129 15:52:44.765785 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:44 crc kubenswrapper[4907]: I1129 15:52:44.927636 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shr9l"] Nov 29 15:52:46 crc kubenswrapper[4907]: I1129 15:52:46.580347 4907 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-shr9l" podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerName="registry-server" containerID="cri-o://062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc" gracePeriod=2 Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.394474 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.500502 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-catalog-content\") pod \"1bb975d1-ad5c-4850-be1d-266cc281dacd\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.500717 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-utilities\") pod \"1bb975d1-ad5c-4850-be1d-266cc281dacd\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.500783 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjj8q\" (UniqueName: \"kubernetes.io/projected/1bb975d1-ad5c-4850-be1d-266cc281dacd-kube-api-access-cjj8q\") pod \"1bb975d1-ad5c-4850-be1d-266cc281dacd\" (UID: \"1bb975d1-ad5c-4850-be1d-266cc281dacd\") " Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.502366 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-utilities" (OuterVolumeSpecName: "utilities") pod "1bb975d1-ad5c-4850-be1d-266cc281dacd" (UID: "1bb975d1-ad5c-4850-be1d-266cc281dacd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.519007 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb975d1-ad5c-4850-be1d-266cc281dacd-kube-api-access-cjj8q" (OuterVolumeSpecName: "kube-api-access-cjj8q") pod "1bb975d1-ad5c-4850-be1d-266cc281dacd" (UID: "1bb975d1-ad5c-4850-be1d-266cc281dacd"). InnerVolumeSpecName "kube-api-access-cjj8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.576978 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1bb975d1-ad5c-4850-be1d-266cc281dacd" (UID: "1bb975d1-ad5c-4850-be1d-266cc281dacd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.596516 4907 generic.go:334] "Generic (PLEG): container finished" podID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerID="062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc" exitCode=0 Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.596660 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-shr9l" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.596675 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shr9l" event={"ID":"1bb975d1-ad5c-4850-be1d-266cc281dacd","Type":"ContainerDied","Data":"062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc"} Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.596729 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shr9l" event={"ID":"1bb975d1-ad5c-4850-be1d-266cc281dacd","Type":"ContainerDied","Data":"161fbb050d18cea1d332e89aa80b660864eb0dcde696de7a575c93d2ec23a740"} Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.596754 4907 scope.go:117] "RemoveContainer" containerID="062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.608633 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.608676 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bb975d1-ad5c-4850-be1d-266cc281dacd-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.608693 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjj8q\" (UniqueName: \"kubernetes.io/projected/1bb975d1-ad5c-4850-be1d-266cc281dacd-kube-api-access-cjj8q\") on node \"crc\" DevicePath \"\"" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.636373 4907 scope.go:117] "RemoveContainer" containerID="4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.643290 4907 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-shr9l"] Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.658908 4907 scope.go:117] "RemoveContainer" containerID="3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.671010 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-shr9l"] Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.714127 4907 scope.go:117] "RemoveContainer" containerID="062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc" Nov 29 15:52:47 crc kubenswrapper[4907]: E1129 15:52:47.715549 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc\": container with ID starting with 062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc not found: ID does not exist" containerID="062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.715758 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc"} err="failed to get container status \"062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc\": rpc error: code = NotFound desc = could not find container \"062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc\": container with ID starting with 062990d37eae07a64d38ad3c7dd1d43136c67250a73b8b619106860f9af415dc not found: ID does not exist" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.715787 4907 scope.go:117] "RemoveContainer" containerID="4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9" Nov 29 15:52:47 crc kubenswrapper[4907]: E1129 15:52:47.716210 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9\": container with ID starting with 4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9 not found: ID does not exist" containerID="4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.716242 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9"} err="failed to get container status \"4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9\": rpc error: code = NotFound desc = could not find container \"4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9\": container with ID starting with 4b241632270cf7da20c5f9bf80315814aa698f5085e6ae492ccbff9ea7017ad9 not found: ID does not exist" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.716257 4907 scope.go:117] "RemoveContainer" containerID="3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f" Nov 29 15:52:47 crc kubenswrapper[4907]: E1129 15:52:47.716508 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f\": container with ID starting with 3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f not found: ID does not exist" containerID="3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f" Nov 29 15:52:47 crc kubenswrapper[4907]: I1129 15:52:47.716536 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f"} err="failed to get container status \"3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f\": rpc error: code = NotFound desc = could not find container 
\"3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f\": container with ID starting with 3dbc31782d8d3c7c3a7554a97832956d5316045abcd27dd09b4e9e6a2621ce2f not found: ID does not exist" Nov 29 15:52:48 crc kubenswrapper[4907]: I1129 15:52:48.494869 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" path="/var/lib/kubelet/pods/1bb975d1-ad5c-4850-be1d-266cc281dacd/volumes" Nov 29 15:52:52 crc kubenswrapper[4907]: I1129 15:52:52.714024 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:52 crc kubenswrapper[4907]: I1129 15:52:52.780285 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:53 crc kubenswrapper[4907]: I1129 15:52:53.268635 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cczl6"] Nov 29 15:52:53 crc kubenswrapper[4907]: I1129 15:52:53.760706 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cczl6" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="registry-server" containerID="cri-o://953b77943f8b813703d5e2b81e96a143c5be20759a1b18f37fa98fd35baee504" gracePeriod=2 Nov 29 15:52:54 crc kubenswrapper[4907]: I1129 15:52:54.481658 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:52:54 crc kubenswrapper[4907]: E1129 15:52:54.482249 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:52:54 crc kubenswrapper[4907]: I1129 15:52:54.771426 4907 generic.go:334] "Generic (PLEG): container finished" podID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerID="953b77943f8b813703d5e2b81e96a143c5be20759a1b18f37fa98fd35baee504" exitCode=0 Nov 29 15:52:54 crc kubenswrapper[4907]: I1129 15:52:54.771549 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczl6" event={"ID":"d37c8b99-e8b9-476d-aee2-240e1eaba0c8","Type":"ContainerDied","Data":"953b77943f8b813703d5e2b81e96a143c5be20759a1b18f37fa98fd35baee504"} Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.081624 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.210812 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-utilities\") pod \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.210894 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-catalog-content\") pod \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.210935 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmhfv\" (UniqueName: \"kubernetes.io/projected/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-kube-api-access-xmhfv\") pod \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\" (UID: \"d37c8b99-e8b9-476d-aee2-240e1eaba0c8\") " Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.212860 4907 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-utilities" (OuterVolumeSpecName: "utilities") pod "d37c8b99-e8b9-476d-aee2-240e1eaba0c8" (UID: "d37c8b99-e8b9-476d-aee2-240e1eaba0c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.222931 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-kube-api-access-xmhfv" (OuterVolumeSpecName: "kube-api-access-xmhfv") pod "d37c8b99-e8b9-476d-aee2-240e1eaba0c8" (UID: "d37c8b99-e8b9-476d-aee2-240e1eaba0c8"). InnerVolumeSpecName "kube-api-access-xmhfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.314035 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.314068 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmhfv\" (UniqueName: \"kubernetes.io/projected/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-kube-api-access-xmhfv\") on node \"crc\" DevicePath \"\"" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.337622 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d37c8b99-e8b9-476d-aee2-240e1eaba0c8" (UID: "d37c8b99-e8b9-476d-aee2-240e1eaba0c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.415996 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d37c8b99-e8b9-476d-aee2-240e1eaba0c8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.784576 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cczl6" event={"ID":"d37c8b99-e8b9-476d-aee2-240e1eaba0c8","Type":"ContainerDied","Data":"7aa093e620495b77fa570a9555e2fbb21b8d15b630ffd42cd96567e387ab3ef3"} Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.784645 4907 scope.go:117] "RemoveContainer" containerID="953b77943f8b813703d5e2b81e96a143c5be20759a1b18f37fa98fd35baee504" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.784659 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cczl6" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.833208 4907 scope.go:117] "RemoveContainer" containerID="2e1c7b662a643f97ed93d380395712aad505d6bedb1e2683837b6dcb40626ddd" Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.834686 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cczl6"] Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.846497 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cczl6"] Nov 29 15:52:55 crc kubenswrapper[4907]: I1129 15:52:55.857693 4907 scope.go:117] "RemoveContainer" containerID="2d779d9dd34c1522424c6df9c352029d7f0c2acc02590c647c823760444643e9" Nov 29 15:52:56 crc kubenswrapper[4907]: I1129 15:52:56.509788 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" path="/var/lib/kubelet/pods/d37c8b99-e8b9-476d-aee2-240e1eaba0c8/volumes" Nov 29 15:53:07 crc 
kubenswrapper[4907]: I1129 15:53:07.480851 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:53:07 crc kubenswrapper[4907]: E1129 15:53:07.482805 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:53:22 crc kubenswrapper[4907]: I1129 15:53:22.505387 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:53:22 crc kubenswrapper[4907]: E1129 15:53:22.506254 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:53:36 crc kubenswrapper[4907]: I1129 15:53:36.480527 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:53:36 crc kubenswrapper[4907]: E1129 15:53:36.483895 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 
29 15:53:51 crc kubenswrapper[4907]: I1129 15:53:51.479681 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:53:51 crc kubenswrapper[4907]: E1129 15:53:51.480791 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:54:03 crc kubenswrapper[4907]: I1129 15:54:03.480302 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:54:03 crc kubenswrapper[4907]: E1129 15:54:03.480984 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:54:16 crc kubenswrapper[4907]: I1129 15:54:16.480257 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:54:16 crc kubenswrapper[4907]: E1129 15:54:16.480932 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:54:30 crc kubenswrapper[4907]: I1129 15:54:30.488133 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:54:30 crc kubenswrapper[4907]: E1129 15:54:30.489256 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:54:41 crc kubenswrapper[4907]: I1129 15:54:41.479943 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:54:41 crc kubenswrapper[4907]: E1129 15:54:41.480954 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:54:56 crc kubenswrapper[4907]: I1129 15:54:56.479883 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:54:56 crc kubenswrapper[4907]: E1129 15:54:56.480650 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:55:08 crc kubenswrapper[4907]: I1129 15:55:08.480572 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:55:08 crc kubenswrapper[4907]: E1129 15:55:08.481464 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:55:20 crc kubenswrapper[4907]: I1129 15:55:20.481012 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:55:20 crc kubenswrapper[4907]: E1129 15:55:20.482617 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:55:20 crc kubenswrapper[4907]: I1129 15:55:20.664328 4907 trace.go:236] Trace[2145725359]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-vqj4q" (29-Nov-2025 15:55:19.476) (total time: 1186ms): Nov 29 15:55:20 crc kubenswrapper[4907]: Trace[2145725359]: [1.186768566s] [1.186768566s] END Nov 29 15:55:35 crc kubenswrapper[4907]: I1129 15:55:35.480247 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 
15:55:35 crc kubenswrapper[4907]: E1129 15:55:35.481838 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:55:46 crc kubenswrapper[4907]: I1129 15:55:46.483020 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:55:46 crc kubenswrapper[4907]: E1129 15:55:46.484400 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:55:59 crc kubenswrapper[4907]: I1129 15:55:59.481279 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:55:59 crc kubenswrapper[4907]: E1129 15:55:59.482516 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:56:10 crc kubenswrapper[4907]: I1129 15:56:10.479506 4907 scope.go:117] "RemoveContainer" 
containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:56:10 crc kubenswrapper[4907]: E1129 15:56:10.480556 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:56:24 crc kubenswrapper[4907]: I1129 15:56:24.479966 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:56:24 crc kubenswrapper[4907]: E1129 15:56:24.480910 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:56:38 crc kubenswrapper[4907]: I1129 15:56:38.479785 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:56:38 crc kubenswrapper[4907]: E1129 15:56:38.480883 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:56:53 crc kubenswrapper[4907]: I1129 15:56:53.480364 4907 scope.go:117] 
"RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:56:53 crc kubenswrapper[4907]: E1129 15:56:53.481670 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:57:06 crc kubenswrapper[4907]: I1129 15:57:06.479957 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:57:06 crc kubenswrapper[4907]: E1129 15:57:06.480844 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:57:21 crc kubenswrapper[4907]: I1129 15:57:21.481528 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:57:21 crc kubenswrapper[4907]: E1129 15:57:21.482710 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 15:57:35 crc kubenswrapper[4907]: I1129 15:57:35.479571 
4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 15:57:36 crc kubenswrapper[4907]: I1129 15:57:36.425500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"45f44dad29ce701e8fc1cdcdaedb9a8d53599d4534d0ecec8bc3e9265b44b92a"} Nov 29 15:59:58 crc kubenswrapper[4907]: I1129 15:59:58.490462 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 15:59:58 crc kubenswrapper[4907]: I1129 15:59:58.491371 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.262407 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w"] Nov 29 16:00:00 crc kubenswrapper[4907]: E1129 16:00:00.263830 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="extract-utilities" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.263867 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="extract-utilities" Nov 29 16:00:00 crc kubenswrapper[4907]: E1129 16:00:00.263887 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerName="registry-server" 
Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.263894 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerName="registry-server" Nov 29 16:00:00 crc kubenswrapper[4907]: E1129 16:00:00.263910 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerName="extract-content" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.263916 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerName="extract-content" Nov 29 16:00:00 crc kubenswrapper[4907]: E1129 16:00:00.263942 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="extract-content" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.263948 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="extract-content" Nov 29 16:00:00 crc kubenswrapper[4907]: E1129 16:00:00.263987 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="registry-server" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.263992 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="registry-server" Nov 29 16:00:00 crc kubenswrapper[4907]: E1129 16:00:00.264006 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerName="extract-utilities" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.264012 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerName="extract-utilities" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.264383 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb975d1-ad5c-4850-be1d-266cc281dacd" containerName="registry-server" Nov 29 
16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.264420 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37c8b99-e8b9-476d-aee2-240e1eaba0c8" containerName="registry-server" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.269383 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.275590 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.275722 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.278636 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w"] Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.434866 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceea489a-3992-4765-911f-a71b7b25491b-config-volume\") pod \"collect-profiles-29407200-r7d7w\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.435350 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ceea489a-3992-4765-911f-a71b7b25491b-secret-volume\") pod \"collect-profiles-29407200-r7d7w\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.435541 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448q5\" (UniqueName: \"kubernetes.io/projected/ceea489a-3992-4765-911f-a71b7b25491b-kube-api-access-448q5\") pod \"collect-profiles-29407200-r7d7w\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.539243 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceea489a-3992-4765-911f-a71b7b25491b-config-volume\") pod \"collect-profiles-29407200-r7d7w\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.539359 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ceea489a-3992-4765-911f-a71b7b25491b-secret-volume\") pod \"collect-profiles-29407200-r7d7w\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.540031 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448q5\" (UniqueName: \"kubernetes.io/projected/ceea489a-3992-4765-911f-a71b7b25491b-kube-api-access-448q5\") pod \"collect-profiles-29407200-r7d7w\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.542219 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceea489a-3992-4765-911f-a71b7b25491b-config-volume\") pod \"collect-profiles-29407200-r7d7w\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.547287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ceea489a-3992-4765-911f-a71b7b25491b-secret-volume\") pod \"collect-profiles-29407200-r7d7w\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.576341 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448q5\" (UniqueName: \"kubernetes.io/projected/ceea489a-3992-4765-911f-a71b7b25491b-kube-api-access-448q5\") pod \"collect-profiles-29407200-r7d7w\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:00 crc kubenswrapper[4907]: I1129 16:00:00.619607 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:01 crc kubenswrapper[4907]: I1129 16:00:01.313422 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w"] Nov 29 16:00:01 crc kubenswrapper[4907]: I1129 16:00:01.361216 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" event={"ID":"ceea489a-3992-4765-911f-a71b7b25491b","Type":"ContainerStarted","Data":"28e5f513eee3d5add8b583169c28aaeaa9c57f9598c7daf244a8c2f34537d1ed"} Nov 29 16:00:02 crc kubenswrapper[4907]: I1129 16:00:02.378167 4907 generic.go:334] "Generic (PLEG): container finished" podID="ceea489a-3992-4765-911f-a71b7b25491b" containerID="ac601602c399021c57da78169761a9ee6db3359b97f4bfc9a3d1d45e802f697f" exitCode=0 Nov 29 16:00:02 crc kubenswrapper[4907]: I1129 16:00:02.378274 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" event={"ID":"ceea489a-3992-4765-911f-a71b7b25491b","Type":"ContainerDied","Data":"ac601602c399021c57da78169761a9ee6db3359b97f4bfc9a3d1d45e802f697f"} Nov 29 16:00:03 crc kubenswrapper[4907]: I1129 16:00:03.804020 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:03 crc kubenswrapper[4907]: I1129 16:00:03.915142 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448q5\" (UniqueName: \"kubernetes.io/projected/ceea489a-3992-4765-911f-a71b7b25491b-kube-api-access-448q5\") pod \"ceea489a-3992-4765-911f-a71b7b25491b\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " Nov 29 16:00:03 crc kubenswrapper[4907]: I1129 16:00:03.915803 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceea489a-3992-4765-911f-a71b7b25491b-config-volume\") pod \"ceea489a-3992-4765-911f-a71b7b25491b\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " Nov 29 16:00:03 crc kubenswrapper[4907]: I1129 16:00:03.915988 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ceea489a-3992-4765-911f-a71b7b25491b-secret-volume\") pod \"ceea489a-3992-4765-911f-a71b7b25491b\" (UID: \"ceea489a-3992-4765-911f-a71b7b25491b\") " Nov 29 16:00:03 crc kubenswrapper[4907]: I1129 16:00:03.916970 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceea489a-3992-4765-911f-a71b7b25491b-config-volume" (OuterVolumeSpecName: "config-volume") pod "ceea489a-3992-4765-911f-a71b7b25491b" (UID: "ceea489a-3992-4765-911f-a71b7b25491b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 16:00:03 crc kubenswrapper[4907]: I1129 16:00:03.924748 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceea489a-3992-4765-911f-a71b7b25491b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ceea489a-3992-4765-911f-a71b7b25491b" (UID: "ceea489a-3992-4765-911f-a71b7b25491b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 16:00:03 crc kubenswrapper[4907]: I1129 16:00:03.925105 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceea489a-3992-4765-911f-a71b7b25491b-kube-api-access-448q5" (OuterVolumeSpecName: "kube-api-access-448q5") pod "ceea489a-3992-4765-911f-a71b7b25491b" (UID: "ceea489a-3992-4765-911f-a71b7b25491b"). InnerVolumeSpecName "kube-api-access-448q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:00:04 crc kubenswrapper[4907]: I1129 16:00:04.019008 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448q5\" (UniqueName: \"kubernetes.io/projected/ceea489a-3992-4765-911f-a71b7b25491b-kube-api-access-448q5\") on node \"crc\" DevicePath \"\"" Nov 29 16:00:04 crc kubenswrapper[4907]: I1129 16:00:04.019040 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ceea489a-3992-4765-911f-a71b7b25491b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 16:00:04 crc kubenswrapper[4907]: I1129 16:00:04.019049 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ceea489a-3992-4765-911f-a71b7b25491b-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 16:00:04 crc kubenswrapper[4907]: I1129 16:00:04.425683 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" event={"ID":"ceea489a-3992-4765-911f-a71b7b25491b","Type":"ContainerDied","Data":"28e5f513eee3d5add8b583169c28aaeaa9c57f9598c7daf244a8c2f34537d1ed"} Nov 29 16:00:04 crc kubenswrapper[4907]: I1129 16:00:04.425827 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407200-r7d7w" Nov 29 16:00:04 crc kubenswrapper[4907]: I1129 16:00:04.426934 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28e5f513eee3d5add8b583169c28aaeaa9c57f9598c7daf244a8c2f34537d1ed" Nov 29 16:00:04 crc kubenswrapper[4907]: I1129 16:00:04.905560 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd"] Nov 29 16:00:04 crc kubenswrapper[4907]: I1129 16:00:04.925481 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407155-jclsd"] Nov 29 16:00:06 crc kubenswrapper[4907]: I1129 16:00:06.497645 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a685f890-93f4-40fc-8613-d7c3a331d927" path="/var/lib/kubelet/pods/a685f890-93f4-40fc-8613-d7c3a331d927/volumes" Nov 29 16:00:28 crc kubenswrapper[4907]: I1129 16:00:28.490521 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:00:28 crc kubenswrapper[4907]: I1129 16:00:28.491015 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:00:31 crc kubenswrapper[4907]: I1129 16:00:31.403685 4907 scope.go:117] "RemoveContainer" containerID="6c401ed14e480d3d503abd95fb383f92378a30298d57d2ba717d04b1eb279dde" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.435118 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-lmkq5"] Nov 29 16:00:52 crc kubenswrapper[4907]: E1129 16:00:52.436663 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceea489a-3992-4765-911f-a71b7b25491b" containerName="collect-profiles" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.436687 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceea489a-3992-4765-911f-a71b7b25491b" containerName="collect-profiles" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.437157 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceea489a-3992-4765-911f-a71b7b25491b" containerName="collect-profiles" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.440630 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.453320 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lmkq5"] Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.575584 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-utilities\") pod \"community-operators-lmkq5\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.576827 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5fms\" (UniqueName: \"kubernetes.io/projected/8db474e1-e5fb-4e82-9e09-70bcf4c64522-kube-api-access-c5fms\") pod \"community-operators-lmkq5\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.577574 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-catalog-content\") pod \"community-operators-lmkq5\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.680487 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-catalog-content\") pod \"community-operators-lmkq5\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.680803 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-utilities\") pod \"community-operators-lmkq5\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.681014 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5fms\" (UniqueName: \"kubernetes.io/projected/8db474e1-e5fb-4e82-9e09-70bcf4c64522-kube-api-access-c5fms\") pod \"community-operators-lmkq5\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.681011 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-catalog-content\") pod \"community-operators-lmkq5\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.682083 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-utilities\") pod \"community-operators-lmkq5\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.702814 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5fms\" (UniqueName: \"kubernetes.io/projected/8db474e1-e5fb-4e82-9e09-70bcf4c64522-kube-api-access-c5fms\") pod \"community-operators-lmkq5\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:52 crc kubenswrapper[4907]: I1129 16:00:52.781805 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:00:53 crc kubenswrapper[4907]: I1129 16:00:53.323564 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lmkq5"] Nov 29 16:00:54 crc kubenswrapper[4907]: I1129 16:00:54.061764 4907 generic.go:334] "Generic (PLEG): container finished" podID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerID="96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba" exitCode=0 Nov 29 16:00:54 crc kubenswrapper[4907]: I1129 16:00:54.061811 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmkq5" event={"ID":"8db474e1-e5fb-4e82-9e09-70bcf4c64522","Type":"ContainerDied","Data":"96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba"} Nov 29 16:00:54 crc kubenswrapper[4907]: I1129 16:00:54.062114 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmkq5" event={"ID":"8db474e1-e5fb-4e82-9e09-70bcf4c64522","Type":"ContainerStarted","Data":"11ff81336eb995b55719f2a14b88f8f31ea03eaf9d9e4e51f719cb3374292004"} Nov 29 16:00:54 crc kubenswrapper[4907]: I1129 
16:00:54.067760 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 16:00:56 crc kubenswrapper[4907]: I1129 16:00:56.085357 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmkq5" event={"ID":"8db474e1-e5fb-4e82-9e09-70bcf4c64522","Type":"ContainerStarted","Data":"ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6"} Nov 29 16:00:57 crc kubenswrapper[4907]: I1129 16:00:57.104951 4907 generic.go:334] "Generic (PLEG): container finished" podID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerID="ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6" exitCode=0 Nov 29 16:00:57 crc kubenswrapper[4907]: I1129 16:00:57.105070 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmkq5" event={"ID":"8db474e1-e5fb-4e82-9e09-70bcf4c64522","Type":"ContainerDied","Data":"ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6"} Nov 29 16:00:58 crc kubenswrapper[4907]: I1129 16:00:58.490103 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:00:58 crc kubenswrapper[4907]: I1129 16:00:58.491208 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:00:58 crc kubenswrapper[4907]: I1129 16:00:58.495678 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 16:00:58 crc 
kubenswrapper[4907]: I1129 16:00:58.496579 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45f44dad29ce701e8fc1cdcdaedb9a8d53599d4534d0ecec8bc3e9265b44b92a"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 16:00:58 crc kubenswrapper[4907]: I1129 16:00:58.496888 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://45f44dad29ce701e8fc1cdcdaedb9a8d53599d4534d0ecec8bc3e9265b44b92a" gracePeriod=600 Nov 29 16:00:59 crc kubenswrapper[4907]: I1129 16:00:59.132615 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="45f44dad29ce701e8fc1cdcdaedb9a8d53599d4534d0ecec8bc3e9265b44b92a" exitCode=0 Nov 29 16:00:59 crc kubenswrapper[4907]: I1129 16:00:59.132700 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"45f44dad29ce701e8fc1cdcdaedb9a8d53599d4534d0ecec8bc3e9265b44b92a"} Nov 29 16:00:59 crc kubenswrapper[4907]: I1129 16:00:59.133045 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e"} Nov 29 16:00:59 crc kubenswrapper[4907]: I1129 16:00:59.133093 4907 scope.go:117] "RemoveContainer" containerID="fae166eb37208ae7f2124c9c6acb600acc0ab7364de7f5736a2708871574e317" Nov 29 16:00:59 crc kubenswrapper[4907]: I1129 16:00:59.137345 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmkq5" event={"ID":"8db474e1-e5fb-4e82-9e09-70bcf4c64522","Type":"ContainerStarted","Data":"95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04"} Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.177196 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lmkq5" podStartSLOduration=4.157198505 podStartE2EDuration="8.176169803s" podCreationTimestamp="2025-11-29 16:00:52 +0000 UTC" firstStartedPulling="2025-11-29 16:00:54.066512854 +0000 UTC m=+5552.053350506" lastFinishedPulling="2025-11-29 16:00:58.085484152 +0000 UTC m=+5556.072321804" observedRunningTime="2025-11-29 16:00:59.173923614 +0000 UTC m=+5557.160761266" watchObservedRunningTime="2025-11-29 16:01:00.176169803 +0000 UTC m=+5558.163007475" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.216216 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29407201-vr9k7"] Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.227876 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.245169 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29407201-vr9k7"] Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.281472 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-config-data\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.281559 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47mg2\" (UniqueName: \"kubernetes.io/projected/23f66094-97ee-4ca1-9f7c-d435aabea4af-kube-api-access-47mg2\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.281597 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-combined-ca-bundle\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.281661 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-fernet-keys\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.384389 4907 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-config-data\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.384495 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47mg2\" (UniqueName: \"kubernetes.io/projected/23f66094-97ee-4ca1-9f7c-d435aabea4af-kube-api-access-47mg2\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.384535 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-combined-ca-bundle\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.384606 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-fernet-keys\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.390920 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-combined-ca-bundle\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.393291 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-config-data\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.395739 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-fernet-keys\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.401115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47mg2\" (UniqueName: \"kubernetes.io/projected/23f66094-97ee-4ca1-9f7c-d435aabea4af-kube-api-access-47mg2\") pod \"keystone-cron-29407201-vr9k7\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:00 crc kubenswrapper[4907]: I1129 16:01:00.552828 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:01 crc kubenswrapper[4907]: I1129 16:01:01.150261 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29407201-vr9k7"] Nov 29 16:01:01 crc kubenswrapper[4907]: I1129 16:01:01.202784 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29407201-vr9k7" event={"ID":"23f66094-97ee-4ca1-9f7c-d435aabea4af","Type":"ContainerStarted","Data":"707c1dc5a68272de52636fa964df0da5cffbcef69e468edeb38830f008108a1e"} Nov 29 16:01:02 crc kubenswrapper[4907]: I1129 16:01:02.216846 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29407201-vr9k7" event={"ID":"23f66094-97ee-4ca1-9f7c-d435aabea4af","Type":"ContainerStarted","Data":"8de2aa471b5d43f87a33a78c86a18ac80ee3dd9cce0e1118b8e172660a819775"} Nov 29 16:01:02 crc kubenswrapper[4907]: I1129 16:01:02.235009 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29407201-vr9k7" podStartSLOduration=2.234993904 podStartE2EDuration="2.234993904s" podCreationTimestamp="2025-11-29 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 16:01:02.230709843 +0000 UTC m=+5560.217547505" watchObservedRunningTime="2025-11-29 16:01:02.234993904 +0000 UTC m=+5560.221831556" Nov 29 16:01:02 crc kubenswrapper[4907]: I1129 16:01:02.782008 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:01:02 crc kubenswrapper[4907]: I1129 16:01:02.782050 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:01:04 crc kubenswrapper[4907]: I1129 16:01:04.036418 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lmkq5" 
podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerName="registry-server" probeResult="failure" output=< Nov 29 16:01:04 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 16:01:04 crc kubenswrapper[4907]: > Nov 29 16:01:04 crc kubenswrapper[4907]: I1129 16:01:04.241247 4907 generic.go:334] "Generic (PLEG): container finished" podID="23f66094-97ee-4ca1-9f7c-d435aabea4af" containerID="8de2aa471b5d43f87a33a78c86a18ac80ee3dd9cce0e1118b8e172660a819775" exitCode=0 Nov 29 16:01:04 crc kubenswrapper[4907]: I1129 16:01:04.241309 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29407201-vr9k7" event={"ID":"23f66094-97ee-4ca1-9f7c-d435aabea4af","Type":"ContainerDied","Data":"8de2aa471b5d43f87a33a78c86a18ac80ee3dd9cce0e1118b8e172660a819775"} Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.694487 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.811703 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-fernet-keys\") pod \"23f66094-97ee-4ca1-9f7c-d435aabea4af\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.811815 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-config-data\") pod \"23f66094-97ee-4ca1-9f7c-d435aabea4af\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.811913 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-combined-ca-bundle\") pod 
\"23f66094-97ee-4ca1-9f7c-d435aabea4af\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.812092 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47mg2\" (UniqueName: \"kubernetes.io/projected/23f66094-97ee-4ca1-9f7c-d435aabea4af-kube-api-access-47mg2\") pod \"23f66094-97ee-4ca1-9f7c-d435aabea4af\" (UID: \"23f66094-97ee-4ca1-9f7c-d435aabea4af\") " Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.824228 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f66094-97ee-4ca1-9f7c-d435aabea4af-kube-api-access-47mg2" (OuterVolumeSpecName: "kube-api-access-47mg2") pod "23f66094-97ee-4ca1-9f7c-d435aabea4af" (UID: "23f66094-97ee-4ca1-9f7c-d435aabea4af"). InnerVolumeSpecName "kube-api-access-47mg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.826913 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "23f66094-97ee-4ca1-9f7c-d435aabea4af" (UID: "23f66094-97ee-4ca1-9f7c-d435aabea4af"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.872649 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23f66094-97ee-4ca1-9f7c-d435aabea4af" (UID: "23f66094-97ee-4ca1-9f7c-d435aabea4af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.922546 4907 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.922580 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.922593 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47mg2\" (UniqueName: \"kubernetes.io/projected/23f66094-97ee-4ca1-9f7c-d435aabea4af-kube-api-access-47mg2\") on node \"crc\" DevicePath \"\"" Nov 29 16:01:05 crc kubenswrapper[4907]: I1129 16:01:05.967595 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-config-data" (OuterVolumeSpecName: "config-data") pod "23f66094-97ee-4ca1-9f7c-d435aabea4af" (UID: "23f66094-97ee-4ca1-9f7c-d435aabea4af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 16:01:06 crc kubenswrapper[4907]: I1129 16:01:06.024969 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f66094-97ee-4ca1-9f7c-d435aabea4af-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 16:01:06 crc kubenswrapper[4907]: I1129 16:01:06.271283 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29407201-vr9k7" event={"ID":"23f66094-97ee-4ca1-9f7c-d435aabea4af","Type":"ContainerDied","Data":"707c1dc5a68272de52636fa964df0da5cffbcef69e468edeb38830f008108a1e"} Nov 29 16:01:06 crc kubenswrapper[4907]: I1129 16:01:06.271336 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="707c1dc5a68272de52636fa964df0da5cffbcef69e468edeb38830f008108a1e" Nov 29 16:01:06 crc kubenswrapper[4907]: I1129 16:01:06.271411 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29407201-vr9k7" Nov 29 16:01:12 crc kubenswrapper[4907]: I1129 16:01:12.879183 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:01:12 crc kubenswrapper[4907]: I1129 16:01:12.937511 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:01:13 crc kubenswrapper[4907]: I1129 16:01:13.133980 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lmkq5"] Nov 29 16:01:14 crc kubenswrapper[4907]: I1129 16:01:14.420011 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lmkq5" podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerName="registry-server" containerID="cri-o://95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04" gracePeriod=2 Nov 29 16:01:14 crc kubenswrapper[4907]: 
I1129 16:01:14.994300 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.160821 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-catalog-content\") pod \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.161229 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5fms\" (UniqueName: \"kubernetes.io/projected/8db474e1-e5fb-4e82-9e09-70bcf4c64522-kube-api-access-c5fms\") pod \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.161323 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-utilities\") pod \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\" (UID: \"8db474e1-e5fb-4e82-9e09-70bcf4c64522\") " Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.162117 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-utilities" (OuterVolumeSpecName: "utilities") pod "8db474e1-e5fb-4e82-9e09-70bcf4c64522" (UID: "8db474e1-e5fb-4e82-9e09-70bcf4c64522"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.162422 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.171046 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db474e1-e5fb-4e82-9e09-70bcf4c64522-kube-api-access-c5fms" (OuterVolumeSpecName: "kube-api-access-c5fms") pod "8db474e1-e5fb-4e82-9e09-70bcf4c64522" (UID: "8db474e1-e5fb-4e82-9e09-70bcf4c64522"). InnerVolumeSpecName "kube-api-access-c5fms". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.233581 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8db474e1-e5fb-4e82-9e09-70bcf4c64522" (UID: "8db474e1-e5fb-4e82-9e09-70bcf4c64522"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.264224 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db474e1-e5fb-4e82-9e09-70bcf4c64522-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.264250 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5fms\" (UniqueName: \"kubernetes.io/projected/8db474e1-e5fb-4e82-9e09-70bcf4c64522-kube-api-access-c5fms\") on node \"crc\" DevicePath \"\"" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.432897 4907 generic.go:334] "Generic (PLEG): container finished" podID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerID="95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04" exitCode=0 Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.432965 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmkq5" event={"ID":"8db474e1-e5fb-4e82-9e09-70bcf4c64522","Type":"ContainerDied","Data":"95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04"} Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.432995 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lmkq5" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.433037 4907 scope.go:117] "RemoveContainer" containerID="95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.433018 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmkq5" event={"ID":"8db474e1-e5fb-4e82-9e09-70bcf4c64522","Type":"ContainerDied","Data":"11ff81336eb995b55719f2a14b88f8f31ea03eaf9d9e4e51f719cb3374292004"} Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.453680 4907 scope.go:117] "RemoveContainer" containerID="ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.482696 4907 scope.go:117] "RemoveContainer" containerID="96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.495077 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lmkq5"] Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.513625 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lmkq5"] Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.532886 4907 scope.go:117] "RemoveContainer" containerID="95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04" Nov 29 16:01:15 crc kubenswrapper[4907]: E1129 16:01:15.534158 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04\": container with ID starting with 95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04 not found: ID does not exist" containerID="95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.534210 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04"} err="failed to get container status \"95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04\": rpc error: code = NotFound desc = could not find container \"95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04\": container with ID starting with 95c01507a340b051afcda19619bad0be691ceb09fee4c8af6a7d8bc35547cc04 not found: ID does not exist" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.534252 4907 scope.go:117] "RemoveContainer" containerID="ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6" Nov 29 16:01:15 crc kubenswrapper[4907]: E1129 16:01:15.534632 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6\": container with ID starting with ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6 not found: ID does not exist" containerID="ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.534717 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6"} err="failed to get container status \"ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6\": rpc error: code = NotFound desc = could not find container \"ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6\": container with ID starting with ac3d6ef9cf1f6bb60eb0b032ded67c49dc10ac5057abe363ac30380854f90be6 not found: ID does not exist" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.534742 4907 scope.go:117] "RemoveContainer" containerID="96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba" Nov 29 16:01:15 crc kubenswrapper[4907]: E1129 
16:01:15.535109 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba\": container with ID starting with 96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba not found: ID does not exist" containerID="96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba" Nov 29 16:01:15 crc kubenswrapper[4907]: I1129 16:01:15.535182 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba"} err="failed to get container status \"96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba\": rpc error: code = NotFound desc = could not find container \"96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba\": container with ID starting with 96c4730f7225b4f36fef35812a835a531b0a62151754d73e9d3124862e099cba not found: ID does not exist" Nov 29 16:01:16 crc kubenswrapper[4907]: I1129 16:01:16.498101 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" path="/var/lib/kubelet/pods/8db474e1-e5fb-4e82-9e09-70bcf4c64522/volumes" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.325753 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4snzw"] Nov 29 16:02:02 crc kubenswrapper[4907]: E1129 16:02:02.327353 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f66094-97ee-4ca1-9f7c-d435aabea4af" containerName="keystone-cron" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.327379 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f66094-97ee-4ca1-9f7c-d435aabea4af" containerName="keystone-cron" Nov 29 16:02:02 crc kubenswrapper[4907]: E1129 16:02:02.327412 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" 
containerName="extract-content" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.327425 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerName="extract-content" Nov 29 16:02:02 crc kubenswrapper[4907]: E1129 16:02:02.327477 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerName="extract-utilities" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.327491 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerName="extract-utilities" Nov 29 16:02:02 crc kubenswrapper[4907]: E1129 16:02:02.327519 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerName="registry-server" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.327531 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerName="registry-server" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.327999 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f66094-97ee-4ca1-9f7c-d435aabea4af" containerName="keystone-cron" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.328050 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db474e1-e5fb-4e82-9e09-70bcf4c64522" containerName="registry-server" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.331390 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.343763 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4snzw"] Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.473995 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntpfb\" (UniqueName: \"kubernetes.io/projected/473de773-5172-42d9-bb55-98d61a2e43d6-kube-api-access-ntpfb\") pod \"redhat-marketplace-4snzw\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.474133 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-utilities\") pod \"redhat-marketplace-4snzw\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.474969 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-catalog-content\") pod \"redhat-marketplace-4snzw\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.577845 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-catalog-content\") pod \"redhat-marketplace-4snzw\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.577995 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ntpfb\" (UniqueName: \"kubernetes.io/projected/473de773-5172-42d9-bb55-98d61a2e43d6-kube-api-access-ntpfb\") pod \"redhat-marketplace-4snzw\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.578118 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-utilities\") pod \"redhat-marketplace-4snzw\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.579022 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-utilities\") pod \"redhat-marketplace-4snzw\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.579044 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-catalog-content\") pod \"redhat-marketplace-4snzw\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.604548 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntpfb\" (UniqueName: \"kubernetes.io/projected/473de773-5172-42d9-bb55-98d61a2e43d6-kube-api-access-ntpfb\") pod \"redhat-marketplace-4snzw\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:02 crc kubenswrapper[4907]: I1129 16:02:02.674363 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:03 crc kubenswrapper[4907]: I1129 16:02:03.128173 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4snzw"] Nov 29 16:02:03 crc kubenswrapper[4907]: I1129 16:02:03.379928 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4snzw" event={"ID":"473de773-5172-42d9-bb55-98d61a2e43d6","Type":"ContainerStarted","Data":"7e316460619076ff51efee4cb8107e9ff330dac8bafe8acc014d3cd574e19001"} Nov 29 16:02:04 crc kubenswrapper[4907]: I1129 16:02:04.394983 4907 generic.go:334] "Generic (PLEG): container finished" podID="473de773-5172-42d9-bb55-98d61a2e43d6" containerID="01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b" exitCode=0 Nov 29 16:02:04 crc kubenswrapper[4907]: I1129 16:02:04.395374 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4snzw" event={"ID":"473de773-5172-42d9-bb55-98d61a2e43d6","Type":"ContainerDied","Data":"01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b"} Nov 29 16:02:05 crc kubenswrapper[4907]: I1129 16:02:05.411305 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4snzw" event={"ID":"473de773-5172-42d9-bb55-98d61a2e43d6","Type":"ContainerStarted","Data":"66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a"} Nov 29 16:02:06 crc kubenswrapper[4907]: I1129 16:02:06.426486 4907 generic.go:334] "Generic (PLEG): container finished" podID="473de773-5172-42d9-bb55-98d61a2e43d6" containerID="66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a" exitCode=0 Nov 29 16:02:06 crc kubenswrapper[4907]: I1129 16:02:06.426572 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4snzw" 
event={"ID":"473de773-5172-42d9-bb55-98d61a2e43d6","Type":"ContainerDied","Data":"66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a"} Nov 29 16:02:07 crc kubenswrapper[4907]: I1129 16:02:07.440316 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4snzw" event={"ID":"473de773-5172-42d9-bb55-98d61a2e43d6","Type":"ContainerStarted","Data":"d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2"} Nov 29 16:02:07 crc kubenswrapper[4907]: I1129 16:02:07.465570 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4snzw" podStartSLOduration=3.019173394 podStartE2EDuration="5.465535601s" podCreationTimestamp="2025-11-29 16:02:02 +0000 UTC" firstStartedPulling="2025-11-29 16:02:04.40022859 +0000 UTC m=+5622.387066282" lastFinishedPulling="2025-11-29 16:02:06.846590827 +0000 UTC m=+5624.833428489" observedRunningTime="2025-11-29 16:02:07.458726059 +0000 UTC m=+5625.445563721" watchObservedRunningTime="2025-11-29 16:02:07.465535601 +0000 UTC m=+5625.452373293" Nov 29 16:02:12 crc kubenswrapper[4907]: I1129 16:02:12.675356 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:12 crc kubenswrapper[4907]: I1129 16:02:12.677583 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:12 crc kubenswrapper[4907]: I1129 16:02:12.763710 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:13 crc kubenswrapper[4907]: I1129 16:02:13.574908 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:13 crc kubenswrapper[4907]: I1129 16:02:13.641868 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4snzw"] Nov 29 16:02:15 crc kubenswrapper[4907]: I1129 16:02:15.539778 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4snzw" podUID="473de773-5172-42d9-bb55-98d61a2e43d6" containerName="registry-server" containerID="cri-o://d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2" gracePeriod=2 Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.091126 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.146763 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntpfb\" (UniqueName: \"kubernetes.io/projected/473de773-5172-42d9-bb55-98d61a2e43d6-kube-api-access-ntpfb\") pod \"473de773-5172-42d9-bb55-98d61a2e43d6\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.147031 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-catalog-content\") pod \"473de773-5172-42d9-bb55-98d61a2e43d6\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.147160 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-utilities\") pod \"473de773-5172-42d9-bb55-98d61a2e43d6\" (UID: \"473de773-5172-42d9-bb55-98d61a2e43d6\") " Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.148032 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-utilities" (OuterVolumeSpecName: "utilities") pod "473de773-5172-42d9-bb55-98d61a2e43d6" (UID: 
"473de773-5172-42d9-bb55-98d61a2e43d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.148181 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.158792 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473de773-5172-42d9-bb55-98d61a2e43d6-kube-api-access-ntpfb" (OuterVolumeSpecName: "kube-api-access-ntpfb") pod "473de773-5172-42d9-bb55-98d61a2e43d6" (UID: "473de773-5172-42d9-bb55-98d61a2e43d6"). InnerVolumeSpecName "kube-api-access-ntpfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.169105 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "473de773-5172-42d9-bb55-98d61a2e43d6" (UID: "473de773-5172-42d9-bb55-98d61a2e43d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.249941 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntpfb\" (UniqueName: \"kubernetes.io/projected/473de773-5172-42d9-bb55-98d61a2e43d6-kube-api-access-ntpfb\") on node \"crc\" DevicePath \"\"" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.249973 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473de773-5172-42d9-bb55-98d61a2e43d6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.557223 4907 generic.go:334] "Generic (PLEG): container finished" podID="473de773-5172-42d9-bb55-98d61a2e43d6" containerID="d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2" exitCode=0 Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.557278 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4snzw" event={"ID":"473de773-5172-42d9-bb55-98d61a2e43d6","Type":"ContainerDied","Data":"d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2"} Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.557318 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4snzw" event={"ID":"473de773-5172-42d9-bb55-98d61a2e43d6","Type":"ContainerDied","Data":"7e316460619076ff51efee4cb8107e9ff330dac8bafe8acc014d3cd574e19001"} Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.557340 4907 scope.go:117] "RemoveContainer" containerID="d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.557526 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4snzw" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.603777 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4snzw"] Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.609303 4907 scope.go:117] "RemoveContainer" containerID="66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.643121 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4snzw"] Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.653892 4907 scope.go:117] "RemoveContainer" containerID="01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.717145 4907 scope.go:117] "RemoveContainer" containerID="d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2" Nov 29 16:02:16 crc kubenswrapper[4907]: E1129 16:02:16.717802 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2\": container with ID starting with d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2 not found: ID does not exist" containerID="d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.717846 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2"} err="failed to get container status \"d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2\": rpc error: code = NotFound desc = could not find container \"d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2\": container with ID starting with d8b2c9d54f2db78d90a1fe7ee85db48e5f6b602b5d6f134a7d439b9aa2bcedb2 not found: 
ID does not exist" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.717871 4907 scope.go:117] "RemoveContainer" containerID="66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a" Nov 29 16:02:16 crc kubenswrapper[4907]: E1129 16:02:16.718567 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a\": container with ID starting with 66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a not found: ID does not exist" containerID="66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.718620 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a"} err="failed to get container status \"66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a\": rpc error: code = NotFound desc = could not find container \"66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a\": container with ID starting with 66d7feb7053d735f223b8916b4457410e4c8f3ef40f31e97be45888526f3d05a not found: ID does not exist" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.718654 4907 scope.go:117] "RemoveContainer" containerID="01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b" Nov 29 16:02:16 crc kubenswrapper[4907]: E1129 16:02:16.719901 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b\": container with ID starting with 01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b not found: ID does not exist" containerID="01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b" Nov 29 16:02:16 crc kubenswrapper[4907]: I1129 16:02:16.719931 4907 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b"} err="failed to get container status \"01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b\": rpc error: code = NotFound desc = could not find container \"01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b\": container with ID starting with 01114768a5808bc12c27c6d8d7b531b19cd21cee99b32302b696719411c3887b not found: ID does not exist" Nov 29 16:02:18 crc kubenswrapper[4907]: I1129 16:02:18.493757 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473de773-5172-42d9-bb55-98d61a2e43d6" path="/var/lib/kubelet/pods/473de773-5172-42d9-bb55-98d61a2e43d6/volumes" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.105235 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xn79x"] Nov 29 16:02:46 crc kubenswrapper[4907]: E1129 16:02:46.109251 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473de773-5172-42d9-bb55-98d61a2e43d6" containerName="extract-utilities" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.109368 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="473de773-5172-42d9-bb55-98d61a2e43d6" containerName="extract-utilities" Nov 29 16:02:46 crc kubenswrapper[4907]: E1129 16:02:46.109467 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473de773-5172-42d9-bb55-98d61a2e43d6" containerName="registry-server" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.109538 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="473de773-5172-42d9-bb55-98d61a2e43d6" containerName="registry-server" Nov 29 16:02:46 crc kubenswrapper[4907]: E1129 16:02:46.109614 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473de773-5172-42d9-bb55-98d61a2e43d6" containerName="extract-content" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.109672 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="473de773-5172-42d9-bb55-98d61a2e43d6" containerName="extract-content" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.110003 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="473de773-5172-42d9-bb55-98d61a2e43d6" containerName="registry-server" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.112205 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.122143 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xn79x"] Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.176322 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-utilities\") pod \"certified-operators-xn79x\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.176406 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-catalog-content\") pod \"certified-operators-xn79x\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.176497 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7f9\" (UniqueName: \"kubernetes.io/projected/b026e40c-0a1c-4868-96b6-a4eca5c227a0-kube-api-access-fb7f9\") pod \"certified-operators-xn79x\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 
16:02:46.277974 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-utilities\") pod \"certified-operators-xn79x\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.278079 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-catalog-content\") pod \"certified-operators-xn79x\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.278157 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7f9\" (UniqueName: \"kubernetes.io/projected/b026e40c-0a1c-4868-96b6-a4eca5c227a0-kube-api-access-fb7f9\") pod \"certified-operators-xn79x\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.279033 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-utilities\") pod \"certified-operators-xn79x\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.279127 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-catalog-content\") pod \"certified-operators-xn79x\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.304573 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7f9\" (UniqueName: \"kubernetes.io/projected/b026e40c-0a1c-4868-96b6-a4eca5c227a0-kube-api-access-fb7f9\") pod \"certified-operators-xn79x\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:46 crc kubenswrapper[4907]: I1129 16:02:46.438000 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:47 crc kubenswrapper[4907]: I1129 16:02:47.011205 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xn79x"] Nov 29 16:02:47 crc kubenswrapper[4907]: I1129 16:02:47.957214 4907 generic.go:334] "Generic (PLEG): container finished" podID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerID="69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae" exitCode=0 Nov 29 16:02:47 crc kubenswrapper[4907]: I1129 16:02:47.957382 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn79x" event={"ID":"b026e40c-0a1c-4868-96b6-a4eca5c227a0","Type":"ContainerDied","Data":"69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae"} Nov 29 16:02:47 crc kubenswrapper[4907]: I1129 16:02:47.957573 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn79x" event={"ID":"b026e40c-0a1c-4868-96b6-a4eca5c227a0","Type":"ContainerStarted","Data":"8352424f954a578880e41925cf6c15a7963c34a35f76c7cf1fee2794236935cd"} Nov 29 16:02:49 crc kubenswrapper[4907]: I1129 16:02:49.995965 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn79x" event={"ID":"b026e40c-0a1c-4868-96b6-a4eca5c227a0","Type":"ContainerStarted","Data":"a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967"} Nov 29 16:02:51 crc kubenswrapper[4907]: I1129 16:02:51.008394 4907 
generic.go:334] "Generic (PLEG): container finished" podID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerID="a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967" exitCode=0 Nov 29 16:02:51 crc kubenswrapper[4907]: I1129 16:02:51.008463 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn79x" event={"ID":"b026e40c-0a1c-4868-96b6-a4eca5c227a0","Type":"ContainerDied","Data":"a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967"} Nov 29 16:02:52 crc kubenswrapper[4907]: I1129 16:02:52.029341 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn79x" event={"ID":"b026e40c-0a1c-4868-96b6-a4eca5c227a0","Type":"ContainerStarted","Data":"228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279"} Nov 29 16:02:52 crc kubenswrapper[4907]: I1129 16:02:52.056214 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xn79x" podStartSLOduration=2.43554046 podStartE2EDuration="6.056185781s" podCreationTimestamp="2025-11-29 16:02:46 +0000 UTC" firstStartedPulling="2025-11-29 16:02:47.96061804 +0000 UTC m=+5665.947455692" lastFinishedPulling="2025-11-29 16:02:51.581263361 +0000 UTC m=+5669.568101013" observedRunningTime="2025-11-29 16:02:52.051763977 +0000 UTC m=+5670.038601669" watchObservedRunningTime="2025-11-29 16:02:52.056185781 +0000 UTC m=+5670.043023443" Nov 29 16:02:56 crc kubenswrapper[4907]: I1129 16:02:56.438722 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:56 crc kubenswrapper[4907]: I1129 16:02:56.440634 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:56 crc kubenswrapper[4907]: I1129 16:02:56.527808 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:57 crc kubenswrapper[4907]: I1129 16:02:57.165283 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:57 crc kubenswrapper[4907]: I1129 16:02:57.243350 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xn79x"] Nov 29 16:02:59 crc kubenswrapper[4907]: I1129 16:02:59.134658 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xn79x" podUID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerName="registry-server" containerID="cri-o://228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279" gracePeriod=2 Nov 29 16:02:59 crc kubenswrapper[4907]: I1129 16:02:59.792850 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:02:59 crc kubenswrapper[4907]: I1129 16:02:59.928180 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-utilities\") pod \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " Nov 29 16:02:59 crc kubenswrapper[4907]: I1129 16:02:59.928619 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb7f9\" (UniqueName: \"kubernetes.io/projected/b026e40c-0a1c-4868-96b6-a4eca5c227a0-kube-api-access-fb7f9\") pod \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " Nov 29 16:02:59 crc kubenswrapper[4907]: I1129 16:02:59.928739 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-utilities" (OuterVolumeSpecName: "utilities") pod "b026e40c-0a1c-4868-96b6-a4eca5c227a0" (UID: 
"b026e40c-0a1c-4868-96b6-a4eca5c227a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:02:59 crc kubenswrapper[4907]: I1129 16:02:59.929186 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-catalog-content\") pod \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\" (UID: \"b026e40c-0a1c-4868-96b6-a4eca5c227a0\") " Nov 29 16:02:59 crc kubenswrapper[4907]: I1129 16:02:59.930641 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:02:59 crc kubenswrapper[4907]: I1129 16:02:59.936203 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b026e40c-0a1c-4868-96b6-a4eca5c227a0-kube-api-access-fb7f9" (OuterVolumeSpecName: "kube-api-access-fb7f9") pod "b026e40c-0a1c-4868-96b6-a4eca5c227a0" (UID: "b026e40c-0a1c-4868-96b6-a4eca5c227a0"). InnerVolumeSpecName "kube-api-access-fb7f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:02:59 crc kubenswrapper[4907]: I1129 16:02:59.982497 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b026e40c-0a1c-4868-96b6-a4eca5c227a0" (UID: "b026e40c-0a1c-4868-96b6-a4eca5c227a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.032871 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb7f9\" (UniqueName: \"kubernetes.io/projected/b026e40c-0a1c-4868-96b6-a4eca5c227a0-kube-api-access-fb7f9\") on node \"crc\" DevicePath \"\"" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.032913 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b026e40c-0a1c-4868-96b6-a4eca5c227a0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.153215 4907 generic.go:334] "Generic (PLEG): container finished" podID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerID="228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279" exitCode=0 Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.153298 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xn79x" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.153322 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn79x" event={"ID":"b026e40c-0a1c-4868-96b6-a4eca5c227a0","Type":"ContainerDied","Data":"228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279"} Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.154938 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn79x" event={"ID":"b026e40c-0a1c-4868-96b6-a4eca5c227a0","Type":"ContainerDied","Data":"8352424f954a578880e41925cf6c15a7963c34a35f76c7cf1fee2794236935cd"} Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.154965 4907 scope.go:117] "RemoveContainer" containerID="228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.185889 4907 scope.go:117] "RemoveContainer" 
containerID="a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.238930 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xn79x"] Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.242760 4907 scope.go:117] "RemoveContainer" containerID="69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.254355 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xn79x"] Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.294696 4907 scope.go:117] "RemoveContainer" containerID="228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279" Nov 29 16:03:00 crc kubenswrapper[4907]: E1129 16:03:00.295550 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279\": container with ID starting with 228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279 not found: ID does not exist" containerID="228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.295610 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279"} err="failed to get container status \"228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279\": rpc error: code = NotFound desc = could not find container \"228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279\": container with ID starting with 228a5741a3324e80d530fa1962d199f516be8b7865898bde4f2bc1186b7e9279 not found: ID does not exist" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.295654 4907 scope.go:117] "RemoveContainer" 
containerID="a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967" Nov 29 16:03:00 crc kubenswrapper[4907]: E1129 16:03:00.299859 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967\": container with ID starting with a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967 not found: ID does not exist" containerID="a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.300078 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967"} err="failed to get container status \"a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967\": rpc error: code = NotFound desc = could not find container \"a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967\": container with ID starting with a9374ba24f9aec4d57f050d28efb2010097fefa09d8c97f3921e5f7292cb4967 not found: ID does not exist" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.300103 4907 scope.go:117] "RemoveContainer" containerID="69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae" Nov 29 16:03:00 crc kubenswrapper[4907]: E1129 16:03:00.300517 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae\": container with ID starting with 69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae not found: ID does not exist" containerID="69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.300537 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae"} err="failed to get container status \"69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae\": rpc error: code = NotFound desc = could not find container \"69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae\": container with ID starting with 69b370615064edfef06b4a1d6828eca270e5db34245eea3dbf6b3abee10b60ae not found: ID does not exist" Nov 29 16:03:00 crc kubenswrapper[4907]: I1129 16:03:00.498787 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" path="/var/lib/kubelet/pods/b026e40c-0a1c-4868-96b6-a4eca5c227a0/volumes" Nov 29 16:03:28 crc kubenswrapper[4907]: I1129 16:03:28.495657 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:03:28 crc kubenswrapper[4907]: I1129 16:03:28.496098 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.170843 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fr4nh"] Nov 29 16:03:29 crc kubenswrapper[4907]: E1129 16:03:29.171801 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerName="extract-content" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.171833 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" 
containerName="extract-content" Nov 29 16:03:29 crc kubenswrapper[4907]: E1129 16:03:29.171898 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerName="extract-utilities" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.171913 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerName="extract-utilities" Nov 29 16:03:29 crc kubenswrapper[4907]: E1129 16:03:29.171976 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerName="registry-server" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.171990 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerName="registry-server" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.172427 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b026e40c-0a1c-4868-96b6-a4eca5c227a0" containerName="registry-server" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.175799 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.181220 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fr4nh"] Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.338862 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fdjm\" (UniqueName: \"kubernetes.io/projected/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-kube-api-access-5fdjm\") pod \"redhat-operators-fr4nh\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.339278 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-utilities\") pod \"redhat-operators-fr4nh\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.339338 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-catalog-content\") pod \"redhat-operators-fr4nh\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.441761 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-utilities\") pod \"redhat-operators-fr4nh\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.441839 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-catalog-content\") pod \"redhat-operators-fr4nh\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.442015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fdjm\" (UniqueName: \"kubernetes.io/projected/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-kube-api-access-5fdjm\") pod \"redhat-operators-fr4nh\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.442770 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-utilities\") pod \"redhat-operators-fr4nh\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.442849 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-catalog-content\") pod \"redhat-operators-fr4nh\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.482049 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fdjm\" (UniqueName: \"kubernetes.io/projected/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-kube-api-access-5fdjm\") pod \"redhat-operators-fr4nh\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:29 crc kubenswrapper[4907]: I1129 16:03:29.530366 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:30 crc kubenswrapper[4907]: I1129 16:03:30.040985 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fr4nh"] Nov 29 16:03:30 crc kubenswrapper[4907]: W1129 16:03:30.051833 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb94279d3_3cc2_43f3_ac1f_ccb8cc420c4c.slice/crio-695b11add8d9b406349f155516af8101bfca38f9188f095a4e209cd8c4899ced WatchSource:0}: Error finding container 695b11add8d9b406349f155516af8101bfca38f9188f095a4e209cd8c4899ced: Status 404 returned error can't find the container with id 695b11add8d9b406349f155516af8101bfca38f9188f095a4e209cd8c4899ced Nov 29 16:03:30 crc kubenswrapper[4907]: I1129 16:03:30.587938 4907 generic.go:334] "Generic (PLEG): container finished" podID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerID="5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6" exitCode=0 Nov 29 16:03:30 crc kubenswrapper[4907]: I1129 16:03:30.588044 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4nh" event={"ID":"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c","Type":"ContainerDied","Data":"5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6"} Nov 29 16:03:30 crc kubenswrapper[4907]: I1129 16:03:30.588283 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4nh" event={"ID":"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c","Type":"ContainerStarted","Data":"695b11add8d9b406349f155516af8101bfca38f9188f095a4e209cd8c4899ced"} Nov 29 16:03:32 crc kubenswrapper[4907]: I1129 16:03:32.615650 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4nh" 
event={"ID":"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c","Type":"ContainerStarted","Data":"d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7"} Nov 29 16:03:35 crc kubenswrapper[4907]: I1129 16:03:35.650325 4907 generic.go:334] "Generic (PLEG): container finished" podID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerID="d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7" exitCode=0 Nov 29 16:03:35 crc kubenswrapper[4907]: I1129 16:03:35.650407 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4nh" event={"ID":"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c","Type":"ContainerDied","Data":"d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7"} Nov 29 16:03:36 crc kubenswrapper[4907]: I1129 16:03:36.676464 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4nh" event={"ID":"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c","Type":"ContainerStarted","Data":"9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6"} Nov 29 16:03:39 crc kubenswrapper[4907]: I1129 16:03:39.530520 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:39 crc kubenswrapper[4907]: I1129 16:03:39.530789 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:40 crc kubenswrapper[4907]: I1129 16:03:40.595318 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fr4nh" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="registry-server" probeResult="failure" output=< Nov 29 16:03:40 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 16:03:40 crc kubenswrapper[4907]: > Nov 29 16:03:51 crc kubenswrapper[4907]: I1129 16:03:51.344068 4907 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-fr4nh" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="registry-server" probeResult="failure" output=< Nov 29 16:03:51 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 16:03:51 crc kubenswrapper[4907]: > Nov 29 16:03:58 crc kubenswrapper[4907]: I1129 16:03:58.489880 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:03:58 crc kubenswrapper[4907]: I1129 16:03:58.490614 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:03:59 crc kubenswrapper[4907]: I1129 16:03:59.596005 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:03:59 crc kubenswrapper[4907]: I1129 16:03:59.620056 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fr4nh" podStartSLOduration=24.996765669 podStartE2EDuration="30.620026505s" podCreationTimestamp="2025-11-29 16:03:29 +0000 UTC" firstStartedPulling="2025-11-29 16:03:30.590110495 +0000 UTC m=+5708.576948187" lastFinishedPulling="2025-11-29 16:03:36.213371331 +0000 UTC m=+5714.200209023" observedRunningTime="2025-11-29 16:03:36.706627858 +0000 UTC m=+5714.693465530" watchObservedRunningTime="2025-11-29 16:03:59.620026505 +0000 UTC m=+5737.606864187" Nov 29 16:03:59 crc kubenswrapper[4907]: I1129 16:03:59.666483 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:04:00 crc kubenswrapper[4907]: I1129 16:04:00.371508 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fr4nh"] Nov 29 16:04:00 crc kubenswrapper[4907]: I1129 16:04:00.988654 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fr4nh" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="registry-server" containerID="cri-o://9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6" gracePeriod=2 Nov 29 16:04:01 crc kubenswrapper[4907]: I1129 16:04:01.729334 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:04:01 crc kubenswrapper[4907]: I1129 16:04:01.887194 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-catalog-content\") pod \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " Nov 29 16:04:01 crc kubenswrapper[4907]: I1129 16:04:01.887291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fdjm\" (UniqueName: \"kubernetes.io/projected/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-kube-api-access-5fdjm\") pod \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " Nov 29 16:04:01 crc kubenswrapper[4907]: I1129 16:04:01.887309 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-utilities\") pod \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\" (UID: \"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c\") " Nov 29 16:04:01 crc kubenswrapper[4907]: I1129 16:04:01.888612 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-utilities" (OuterVolumeSpecName: "utilities") pod "b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" (UID: "b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:04:01 crc kubenswrapper[4907]: I1129 16:04:01.896671 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-kube-api-access-5fdjm" (OuterVolumeSpecName: "kube-api-access-5fdjm") pod "b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" (UID: "b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c"). InnerVolumeSpecName "kube-api-access-5fdjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:04:01 crc kubenswrapper[4907]: I1129 16:04:01.990849 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fdjm\" (UniqueName: \"kubernetes.io/projected/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-kube-api-access-5fdjm\") on node \"crc\" DevicePath \"\"" Nov 29 16:04:01 crc kubenswrapper[4907]: I1129 16:04:01.990902 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.006059 4907 generic.go:334] "Generic (PLEG): container finished" podID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerID="9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6" exitCode=0 Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.006142 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4nh" event={"ID":"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c","Type":"ContainerDied","Data":"9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6"} Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.006162 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fr4nh" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.006199 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4nh" event={"ID":"b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c","Type":"ContainerDied","Data":"695b11add8d9b406349f155516af8101bfca38f9188f095a4e209cd8c4899ced"} Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.006245 4907 scope.go:117] "RemoveContainer" containerID="9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.022172 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" (UID: "b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.036693 4907 scope.go:117] "RemoveContainer" containerID="d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.076211 4907 scope.go:117] "RemoveContainer" containerID="5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.094255 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.148189 4907 scope.go:117] "RemoveContainer" containerID="9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6" Nov 29 16:04:02 crc kubenswrapper[4907]: E1129 16:04:02.149921 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6\": container with ID starting with 9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6 not found: ID does not exist" containerID="9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.149982 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6"} err="failed to get container status \"9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6\": rpc error: code = NotFound desc = could not find container \"9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6\": container with ID starting with 9cf0750bc25e69960fa893f4871c37cb60050065ca5e75f2c9c96eaed59d0ae6 not found: ID does not exist" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.150028 4907 scope.go:117] "RemoveContainer" containerID="d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7" Nov 29 16:04:02 crc kubenswrapper[4907]: E1129 16:04:02.150424 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7\": container with ID starting with d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7 not found: ID does not exist" containerID="d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.150633 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7"} err="failed to get container status \"d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7\": rpc error: code = NotFound desc = could not find container \"d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7\": 
container with ID starting with d72d0cb0001768ab384f55c593ae13369ec73df9d54ebccd0f13bbdba6e981f7 not found: ID does not exist" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.150662 4907 scope.go:117] "RemoveContainer" containerID="5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6" Nov 29 16:04:02 crc kubenswrapper[4907]: E1129 16:04:02.151074 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6\": container with ID starting with 5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6 not found: ID does not exist" containerID="5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.151113 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6"} err="failed to get container status \"5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6\": rpc error: code = NotFound desc = could not find container \"5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6\": container with ID starting with 5b206dde16298b77483af6b7f0087df70616485f1c611a4b86ba881405442ff6 not found: ID does not exist" Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.369322 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fr4nh"] Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.384422 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fr4nh"] Nov 29 16:04:02 crc kubenswrapper[4907]: I1129 16:04:02.502921 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" path="/var/lib/kubelet/pods/b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c/volumes" Nov 29 16:04:28 crc kubenswrapper[4907]: 
I1129 16:04:28.490069 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:04:28 crc kubenswrapper[4907]: I1129 16:04:28.490546 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:04:28 crc kubenswrapper[4907]: I1129 16:04:28.492215 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 16:04:28 crc kubenswrapper[4907]: I1129 16:04:28.493220 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 16:04:28 crc kubenswrapper[4907]: I1129 16:04:28.493304 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" gracePeriod=600 Nov 29 16:04:28 crc kubenswrapper[4907]: E1129 16:04:28.621247 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:04:29 crc kubenswrapper[4907]: I1129 16:04:29.418069 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" exitCode=0 Nov 29 16:04:29 crc kubenswrapper[4907]: I1129 16:04:29.418154 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e"} Nov 29 16:04:29 crc kubenswrapper[4907]: I1129 16:04:29.418216 4907 scope.go:117] "RemoveContainer" containerID="45f44dad29ce701e8fc1cdcdaedb9a8d53599d4534d0ecec8bc3e9265b44b92a" Nov 29 16:04:29 crc kubenswrapper[4907]: I1129 16:04:29.420355 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:04:29 crc kubenswrapper[4907]: E1129 16:04:29.421849 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:04:44 crc kubenswrapper[4907]: I1129 16:04:44.480616 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:04:44 crc kubenswrapper[4907]: E1129 16:04:44.481317 4907 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:04:58 crc kubenswrapper[4907]: I1129 16:04:58.480080 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:04:58 crc kubenswrapper[4907]: E1129 16:04:58.481472 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:05:09 crc kubenswrapper[4907]: I1129 16:05:09.480298 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:05:09 crc kubenswrapper[4907]: E1129 16:05:09.481679 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:05:24 crc kubenswrapper[4907]: I1129 16:05:24.182552 4907 generic.go:334] "Generic (PLEG): container finished" podID="e689bb5a-7b28-48c6-995f-bc0dc07078de" containerID="da80da5f18935b57f85bdec8a5c210ee17e62bfc05d60865d2ef53d76541deef" exitCode=0 Nov 29 16:05:24 crc 
kubenswrapper[4907]: I1129 16:05:24.182665 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e689bb5a-7b28-48c6-995f-bc0dc07078de","Type":"ContainerDied","Data":"da80da5f18935b57f85bdec8a5c210ee17e62bfc05d60865d2ef53d76541deef"} Nov 29 16:05:24 crc kubenswrapper[4907]: I1129 16:05:24.481322 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:05:24 crc kubenswrapper[4907]: E1129 16:05:24.482158 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.595547 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.649278 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e689bb5a-7b28-48c6-995f-bc0dc07078de\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.649344 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config-secret\") pod \"e689bb5a-7b28-48c6-995f-bc0dc07078de\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.649387 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-config-data\") pod \"e689bb5a-7b28-48c6-995f-bc0dc07078de\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.649407 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ca-certs\") pod \"e689bb5a-7b28-48c6-995f-bc0dc07078de\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.649457 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-temporary\") pod \"e689bb5a-7b28-48c6-995f-bc0dc07078de\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.649760 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-workdir\") pod \"e689bb5a-7b28-48c6-995f-bc0dc07078de\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.652150 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92s7r\" (UniqueName: \"kubernetes.io/projected/e689bb5a-7b28-48c6-995f-bc0dc07078de-kube-api-access-92s7r\") pod \"e689bb5a-7b28-48c6-995f-bc0dc07078de\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.651715 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e689bb5a-7b28-48c6-995f-bc0dc07078de" (UID: "e689bb5a-7b28-48c6-995f-bc0dc07078de"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.653071 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ssh-key\") pod \"e689bb5a-7b28-48c6-995f-bc0dc07078de\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.653103 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-config-data" (OuterVolumeSpecName: "config-data") pod "e689bb5a-7b28-48c6-995f-bc0dc07078de" (UID: "e689bb5a-7b28-48c6-995f-bc0dc07078de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.653395 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config\") pod \"e689bb5a-7b28-48c6-995f-bc0dc07078de\" (UID: \"e689bb5a-7b28-48c6-995f-bc0dc07078de\") " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.654886 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.654934 4907 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.661054 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e689bb5a-7b28-48c6-995f-bc0dc07078de" (UID: "e689bb5a-7b28-48c6-995f-bc0dc07078de"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.661628 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e689bb5a-7b28-48c6-995f-bc0dc07078de" (UID: "e689bb5a-7b28-48c6-995f-bc0dc07078de"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.663423 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e689bb5a-7b28-48c6-995f-bc0dc07078de-kube-api-access-92s7r" (OuterVolumeSpecName: "kube-api-access-92s7r") pod "e689bb5a-7b28-48c6-995f-bc0dc07078de" (UID: "e689bb5a-7b28-48c6-995f-bc0dc07078de"). InnerVolumeSpecName "kube-api-access-92s7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.705193 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e689bb5a-7b28-48c6-995f-bc0dc07078de" (UID: "e689bb5a-7b28-48c6-995f-bc0dc07078de"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.712975 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e689bb5a-7b28-48c6-995f-bc0dc07078de" (UID: "e689bb5a-7b28-48c6-995f-bc0dc07078de"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.715231 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e689bb5a-7b28-48c6-995f-bc0dc07078de" (UID: "e689bb5a-7b28-48c6-995f-bc0dc07078de"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.743370 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e689bb5a-7b28-48c6-995f-bc0dc07078de" (UID: "e689bb5a-7b28-48c6-995f-bc0dc07078de"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.757828 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.758219 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.758270 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.758285 4907 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.758300 4907 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e689bb5a-7b28-48c6-995f-bc0dc07078de-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.758312 4907 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-92s7r\" (UniqueName: \"kubernetes.io/projected/e689bb5a-7b28-48c6-995f-bc0dc07078de-kube-api-access-92s7r\") on node \"crc\" DevicePath \"\"" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.758326 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e689bb5a-7b28-48c6-995f-bc0dc07078de-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.784087 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 29 16:05:26 crc kubenswrapper[4907]: I1129 16:05:26.861700 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 29 16:05:27 crc kubenswrapper[4907]: I1129 16:05:27.241335 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e689bb5a-7b28-48c6-995f-bc0dc07078de","Type":"ContainerDied","Data":"b548c1ce3c7dbe990310154a9407db9e1796867b63905ac16605c2ec24aaf538"} Nov 29 16:05:27 crc kubenswrapper[4907]: I1129 16:05:27.241377 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b548c1ce3c7dbe990310154a9407db9e1796867b63905ac16605c2ec24aaf538" Nov 29 16:05:27 crc kubenswrapper[4907]: I1129 16:05:27.241457 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.828325 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 29 16:05:34 crc kubenswrapper[4907]: E1129 16:05:34.829359 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e689bb5a-7b28-48c6-995f-bc0dc07078de" containerName="tempest-tests-tempest-tests-runner" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.829374 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e689bb5a-7b28-48c6-995f-bc0dc07078de" containerName="tempest-tests-tempest-tests-runner" Nov 29 16:05:34 crc kubenswrapper[4907]: E1129 16:05:34.829391 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="extract-utilities" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.829400 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="extract-utilities" Nov 29 16:05:34 crc kubenswrapper[4907]: E1129 16:05:34.829423 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="registry-server" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.829432 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="registry-server" Nov 29 16:05:34 crc kubenswrapper[4907]: E1129 16:05:34.829470 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="extract-content" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.829476 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="extract-content" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.829697 4907 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e689bb5a-7b28-48c6-995f-bc0dc07078de" containerName="tempest-tests-tempest-tests-runner" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.829717 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b94279d3-3cc2-43f3-ac1f-ccb8cc420c4c" containerName="registry-server" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.830431 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.833807 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gwjbm" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.852711 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.985183 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"520ecac5-d7f0-4863-8d88-4789fcad831a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 16:05:34 crc kubenswrapper[4907]: I1129 16:05:34.985488 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5nmh\" (UniqueName: \"kubernetes.io/projected/520ecac5-d7f0-4863-8d88-4789fcad831a-kube-api-access-v5nmh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"520ecac5-d7f0-4863-8d88-4789fcad831a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 16:05:35 crc kubenswrapper[4907]: I1129 16:05:35.088721 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"520ecac5-d7f0-4863-8d88-4789fcad831a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 16:05:35 crc kubenswrapper[4907]: I1129 16:05:35.089036 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5nmh\" (UniqueName: \"kubernetes.io/projected/520ecac5-d7f0-4863-8d88-4789fcad831a-kube-api-access-v5nmh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"520ecac5-d7f0-4863-8d88-4789fcad831a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 16:05:35 crc kubenswrapper[4907]: I1129 16:05:35.089885 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"520ecac5-d7f0-4863-8d88-4789fcad831a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 16:05:35 crc kubenswrapper[4907]: I1129 16:05:35.117649 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5nmh\" (UniqueName: \"kubernetes.io/projected/520ecac5-d7f0-4863-8d88-4789fcad831a-kube-api-access-v5nmh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"520ecac5-d7f0-4863-8d88-4789fcad831a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 16:05:35 crc kubenswrapper[4907]: I1129 16:05:35.155332 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"520ecac5-d7f0-4863-8d88-4789fcad831a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 16:05:35 
crc kubenswrapper[4907]: I1129 16:05:35.211129 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 29 16:05:35 crc kubenswrapper[4907]: I1129 16:05:35.755787 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 29 16:05:36 crc kubenswrapper[4907]: I1129 16:05:36.375466 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"520ecac5-d7f0-4863-8d88-4789fcad831a","Type":"ContainerStarted","Data":"730da89b989c424dfbf29ad79952f73c7d535db3884a6ad8c9f722f738f8c79c"} Nov 29 16:05:37 crc kubenswrapper[4907]: I1129 16:05:37.392153 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"520ecac5-d7f0-4863-8d88-4789fcad831a","Type":"ContainerStarted","Data":"a35723d06768a76944b5da9d8bb6cd1eaf93eb0d38b3317f1214a1ff5cf40ab6"} Nov 29 16:05:37 crc kubenswrapper[4907]: I1129 16:05:37.420301 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.153388 podStartE2EDuration="3.42027776s" podCreationTimestamp="2025-11-29 16:05:34 +0000 UTC" firstStartedPulling="2025-11-29 16:05:35.758782475 +0000 UTC m=+5833.745620137" lastFinishedPulling="2025-11-29 16:05:37.025672225 +0000 UTC m=+5835.012509897" observedRunningTime="2025-11-29 16:05:37.409537865 +0000 UTC m=+5835.396375547" watchObservedRunningTime="2025-11-29 16:05:37.42027776 +0000 UTC m=+5835.407115432" Nov 29 16:05:39 crc kubenswrapper[4907]: I1129 16:05:39.485689 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:05:39 crc kubenswrapper[4907]: E1129 16:05:39.486487 4907 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:05:53 crc kubenswrapper[4907]: I1129 16:05:53.480616 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:05:53 crc kubenswrapper[4907]: E1129 16:05:53.481346 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.480395 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:06:05 crc kubenswrapper[4907]: E1129 16:06:05.481152 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.515668 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2tpnk/must-gather-xbmcr"] Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.517977 4907 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.520088 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2tpnk"/"default-dockercfg-h22dz" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.520197 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2tpnk"/"openshift-service-ca.crt" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.520944 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2tpnk"/"kube-root-ca.crt" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.527641 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-must-gather-output\") pod \"must-gather-xbmcr\" (UID: \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\") " pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.527691 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tztg\" (UniqueName: \"kubernetes.io/projected/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-kube-api-access-4tztg\") pod \"must-gather-xbmcr\" (UID: \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\") " pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.543861 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2tpnk/must-gather-xbmcr"] Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.630484 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-must-gather-output\") pod \"must-gather-xbmcr\" (UID: \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\") " 
pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.630533 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tztg\" (UniqueName: \"kubernetes.io/projected/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-kube-api-access-4tztg\") pod \"must-gather-xbmcr\" (UID: \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\") " pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.630921 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-must-gather-output\") pod \"must-gather-xbmcr\" (UID: \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\") " pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.650848 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tztg\" (UniqueName: \"kubernetes.io/projected/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-kube-api-access-4tztg\") pod \"must-gather-xbmcr\" (UID: \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\") " pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:06:05 crc kubenswrapper[4907]: I1129 16:06:05.844143 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:06:06 crc kubenswrapper[4907]: I1129 16:06:06.350863 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2tpnk/must-gather-xbmcr"] Nov 29 16:06:06 crc kubenswrapper[4907]: I1129 16:06:06.369564 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 16:06:06 crc kubenswrapper[4907]: I1129 16:06:06.784248 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" event={"ID":"c1390a38-2afc-4b68-bf02-d2257ac3ef8c","Type":"ContainerStarted","Data":"92823c362269327569ab37c351cedc8f5444ddd5e51f63f162a903931966a421"} Nov 29 16:06:12 crc kubenswrapper[4907]: I1129 16:06:12.857498 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" event={"ID":"c1390a38-2afc-4b68-bf02-d2257ac3ef8c","Type":"ContainerStarted","Data":"cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403"} Nov 29 16:06:12 crc kubenswrapper[4907]: I1129 16:06:12.857844 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" event={"ID":"c1390a38-2afc-4b68-bf02-d2257ac3ef8c","Type":"ContainerStarted","Data":"a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8"} Nov 29 16:06:12 crc kubenswrapper[4907]: I1129 16:06:12.885392 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" podStartSLOduration=2.3696642 podStartE2EDuration="7.885369807s" podCreationTimestamp="2025-11-29 16:06:05 +0000 UTC" firstStartedPulling="2025-11-29 16:06:06.369332772 +0000 UTC m=+5864.356170424" lastFinishedPulling="2025-11-29 16:06:11.885038349 +0000 UTC m=+5869.871876031" observedRunningTime="2025-11-29 16:06:12.878857273 +0000 UTC m=+5870.865694965" watchObservedRunningTime="2025-11-29 16:06:12.885369807 +0000 UTC 
m=+5870.872207459" Nov 29 16:06:16 crc kubenswrapper[4907]: I1129 16:06:16.867029 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2tpnk/crc-debug-6vf5c"] Nov 29 16:06:16 crc kubenswrapper[4907]: I1129 16:06:16.869343 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:06:17 crc kubenswrapper[4907]: I1129 16:06:17.026189 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fjqx\" (UniqueName: \"kubernetes.io/projected/1254234b-673c-4047-bb82-6ac7d8949559-kube-api-access-7fjqx\") pod \"crc-debug-6vf5c\" (UID: \"1254234b-673c-4047-bb82-6ac7d8949559\") " pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:06:17 crc kubenswrapper[4907]: I1129 16:06:17.026808 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1254234b-673c-4047-bb82-6ac7d8949559-host\") pod \"crc-debug-6vf5c\" (UID: \"1254234b-673c-4047-bb82-6ac7d8949559\") " pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:06:17 crc kubenswrapper[4907]: I1129 16:06:17.130961 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fjqx\" (UniqueName: \"kubernetes.io/projected/1254234b-673c-4047-bb82-6ac7d8949559-kube-api-access-7fjqx\") pod \"crc-debug-6vf5c\" (UID: \"1254234b-673c-4047-bb82-6ac7d8949559\") " pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:06:17 crc kubenswrapper[4907]: I1129 16:06:17.131574 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1254234b-673c-4047-bb82-6ac7d8949559-host\") pod \"crc-debug-6vf5c\" (UID: \"1254234b-673c-4047-bb82-6ac7d8949559\") " pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:06:17 crc kubenswrapper[4907]: I1129 16:06:17.131860 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1254234b-673c-4047-bb82-6ac7d8949559-host\") pod \"crc-debug-6vf5c\" (UID: \"1254234b-673c-4047-bb82-6ac7d8949559\") " pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:06:17 crc kubenswrapper[4907]: I1129 16:06:17.156202 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fjqx\" (UniqueName: \"kubernetes.io/projected/1254234b-673c-4047-bb82-6ac7d8949559-kube-api-access-7fjqx\") pod \"crc-debug-6vf5c\" (UID: \"1254234b-673c-4047-bb82-6ac7d8949559\") " pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:06:17 crc kubenswrapper[4907]: I1129 16:06:17.189421 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:06:17 crc kubenswrapper[4907]: W1129 16:06:17.218550 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1254234b_673c_4047_bb82_6ac7d8949559.slice/crio-00db15e20978323b54e3588470b2b7cf858bb05984a0a2a913c16e39ee6257f8 WatchSource:0}: Error finding container 00db15e20978323b54e3588470b2b7cf858bb05984a0a2a913c16e39ee6257f8: Status 404 returned error can't find the container with id 00db15e20978323b54e3588470b2b7cf858bb05984a0a2a913c16e39ee6257f8 Nov 29 16:06:17 crc kubenswrapper[4907]: I1129 16:06:17.913530 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" event={"ID":"1254234b-673c-4047-bb82-6ac7d8949559","Type":"ContainerStarted","Data":"00db15e20978323b54e3588470b2b7cf858bb05984a0a2a913c16e39ee6257f8"} Nov 29 16:06:19 crc kubenswrapper[4907]: I1129 16:06:19.480241 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:06:19 crc kubenswrapper[4907]: E1129 16:06:19.480899 4907 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:06:31 crc kubenswrapper[4907]: I1129 16:06:31.068640 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" event={"ID":"1254234b-673c-4047-bb82-6ac7d8949559","Type":"ContainerStarted","Data":"98668a58006093d6addfe0d5404af358e7022e6a9b937c57ce46a03664aa9f05"} Nov 29 16:06:31 crc kubenswrapper[4907]: I1129 16:06:31.101030 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" podStartSLOduration=2.426914326 podStartE2EDuration="15.101001241s" podCreationTimestamp="2025-11-29 16:06:16 +0000 UTC" firstStartedPulling="2025-11-29 16:06:17.220905963 +0000 UTC m=+5875.207743615" lastFinishedPulling="2025-11-29 16:06:29.894992868 +0000 UTC m=+5887.881830530" observedRunningTime="2025-11-29 16:06:31.090569195 +0000 UTC m=+5889.077406847" watchObservedRunningTime="2025-11-29 16:06:31.101001241 +0000 UTC m=+5889.087838893" Nov 29 16:06:31 crc kubenswrapper[4907]: I1129 16:06:31.480388 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:06:31 crc kubenswrapper[4907]: E1129 16:06:31.480905 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" 
podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:06:42 crc kubenswrapper[4907]: I1129 16:06:42.488309 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:06:42 crc kubenswrapper[4907]: E1129 16:06:42.488915 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:06:53 crc kubenswrapper[4907]: I1129 16:06:53.479953 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:06:53 crc kubenswrapper[4907]: E1129 16:06:53.480665 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:07:07 crc kubenswrapper[4907]: I1129 16:07:07.483853 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:07:07 crc kubenswrapper[4907]: E1129 16:07:07.484493 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:07:16 crc kubenswrapper[4907]: I1129 16:07:16.608248 4907 generic.go:334] "Generic (PLEG): container finished" podID="1254234b-673c-4047-bb82-6ac7d8949559" containerID="98668a58006093d6addfe0d5404af358e7022e6a9b937c57ce46a03664aa9f05" exitCode=0 Nov 29 16:07:16 crc kubenswrapper[4907]: I1129 16:07:16.608299 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" event={"ID":"1254234b-673c-4047-bb82-6ac7d8949559","Type":"ContainerDied","Data":"98668a58006093d6addfe0d5404af358e7022e6a9b937c57ce46a03664aa9f05"} Nov 29 16:07:17 crc kubenswrapper[4907]: I1129 16:07:17.763562 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:07:17 crc kubenswrapper[4907]: I1129 16:07:17.798890 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2tpnk/crc-debug-6vf5c"] Nov 29 16:07:17 crc kubenswrapper[4907]: I1129 16:07:17.810225 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2tpnk/crc-debug-6vf5c"] Nov 29 16:07:17 crc kubenswrapper[4907]: I1129 16:07:17.848557 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fjqx\" (UniqueName: \"kubernetes.io/projected/1254234b-673c-4047-bb82-6ac7d8949559-kube-api-access-7fjqx\") pod \"1254234b-673c-4047-bb82-6ac7d8949559\" (UID: \"1254234b-673c-4047-bb82-6ac7d8949559\") " Nov 29 16:07:17 crc kubenswrapper[4907]: I1129 16:07:17.848715 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1254234b-673c-4047-bb82-6ac7d8949559-host\") pod \"1254234b-673c-4047-bb82-6ac7d8949559\" (UID: \"1254234b-673c-4047-bb82-6ac7d8949559\") " Nov 29 16:07:17 crc kubenswrapper[4907]: I1129 16:07:17.848844 4907 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1254234b-673c-4047-bb82-6ac7d8949559-host" (OuterVolumeSpecName: "host") pod "1254234b-673c-4047-bb82-6ac7d8949559" (UID: "1254234b-673c-4047-bb82-6ac7d8949559"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 16:07:17 crc kubenswrapper[4907]: I1129 16:07:17.849423 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1254234b-673c-4047-bb82-6ac7d8949559-host\") on node \"crc\" DevicePath \"\"" Nov 29 16:07:17 crc kubenswrapper[4907]: I1129 16:07:17.866245 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1254234b-673c-4047-bb82-6ac7d8949559-kube-api-access-7fjqx" (OuterVolumeSpecName: "kube-api-access-7fjqx") pod "1254234b-673c-4047-bb82-6ac7d8949559" (UID: "1254234b-673c-4047-bb82-6ac7d8949559"). InnerVolumeSpecName "kube-api-access-7fjqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:07:17 crc kubenswrapper[4907]: I1129 16:07:17.952069 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fjqx\" (UniqueName: \"kubernetes.io/projected/1254234b-673c-4047-bb82-6ac7d8949559-kube-api-access-7fjqx\") on node \"crc\" DevicePath \"\"" Nov 29 16:07:18 crc kubenswrapper[4907]: I1129 16:07:18.491348 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1254234b-673c-4047-bb82-6ac7d8949559" path="/var/lib/kubelet/pods/1254234b-673c-4047-bb82-6ac7d8949559/volumes" Nov 29 16:07:18 crc kubenswrapper[4907]: I1129 16:07:18.637933 4907 scope.go:117] "RemoveContainer" containerID="98668a58006093d6addfe0d5404af358e7022e6a9b937c57ce46a03664aa9f05" Nov 29 16:07:18 crc kubenswrapper[4907]: I1129 16:07:18.638024 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-6vf5c" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.039636 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2tpnk/crc-debug-5btkr"] Nov 29 16:07:19 crc kubenswrapper[4907]: E1129 16:07:19.040275 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1254234b-673c-4047-bb82-6ac7d8949559" containerName="container-00" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.040293 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1254234b-673c-4047-bb82-6ac7d8949559" containerName="container-00" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.040537 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1254234b-673c-4047-bb82-6ac7d8949559" containerName="container-00" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.041359 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.176831 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr42b\" (UniqueName: \"kubernetes.io/projected/e596f733-d46a-4210-9351-50420781dd29-kube-api-access-gr42b\") pod \"crc-debug-5btkr\" (UID: \"e596f733-d46a-4210-9351-50420781dd29\") " pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.177175 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e596f733-d46a-4210-9351-50420781dd29-host\") pod \"crc-debug-5btkr\" (UID: \"e596f733-d46a-4210-9351-50420781dd29\") " pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.279371 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr42b\" (UniqueName: 
\"kubernetes.io/projected/e596f733-d46a-4210-9351-50420781dd29-kube-api-access-gr42b\") pod \"crc-debug-5btkr\" (UID: \"e596f733-d46a-4210-9351-50420781dd29\") " pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.279431 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e596f733-d46a-4210-9351-50420781dd29-host\") pod \"crc-debug-5btkr\" (UID: \"e596f733-d46a-4210-9351-50420781dd29\") " pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.279604 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e596f733-d46a-4210-9351-50420781dd29-host\") pod \"crc-debug-5btkr\" (UID: \"e596f733-d46a-4210-9351-50420781dd29\") " pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.307701 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr42b\" (UniqueName: \"kubernetes.io/projected/e596f733-d46a-4210-9351-50420781dd29-kube-api-access-gr42b\") pod \"crc-debug-5btkr\" (UID: \"e596f733-d46a-4210-9351-50420781dd29\") " pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.358850 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:19 crc kubenswrapper[4907]: W1129 16:07:19.418459 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode596f733_d46a_4210_9351_50420781dd29.slice/crio-992e0263580c275778628636ff95bca4a9c5546af2c91797c0d82092fd508a03 WatchSource:0}: Error finding container 992e0263580c275778628636ff95bca4a9c5546af2c91797c0d82092fd508a03: Status 404 returned error can't find the container with id 992e0263580c275778628636ff95bca4a9c5546af2c91797c0d82092fd508a03 Nov 29 16:07:19 crc kubenswrapper[4907]: I1129 16:07:19.647541 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/crc-debug-5btkr" event={"ID":"e596f733-d46a-4210-9351-50420781dd29","Type":"ContainerStarted","Data":"992e0263580c275778628636ff95bca4a9c5546af2c91797c0d82092fd508a03"} Nov 29 16:07:20 crc kubenswrapper[4907]: I1129 16:07:20.660512 4907 generic.go:334] "Generic (PLEG): container finished" podID="e596f733-d46a-4210-9351-50420781dd29" containerID="b3a97e0476c36fc5b45840a203d7cc83a5423c572c354909b5246861e622f099" exitCode=0 Nov 29 16:07:20 crc kubenswrapper[4907]: I1129 16:07:20.660559 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/crc-debug-5btkr" event={"ID":"e596f733-d46a-4210-9351-50420781dd29","Type":"ContainerDied","Data":"b3a97e0476c36fc5b45840a203d7cc83a5423c572c354909b5246861e622f099"} Nov 29 16:07:21 crc kubenswrapper[4907]: I1129 16:07:21.802652 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:21 crc kubenswrapper[4907]: I1129 16:07:21.934292 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e596f733-d46a-4210-9351-50420781dd29-host\") pod \"e596f733-d46a-4210-9351-50420781dd29\" (UID: \"e596f733-d46a-4210-9351-50420781dd29\") " Nov 29 16:07:21 crc kubenswrapper[4907]: I1129 16:07:21.934411 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr42b\" (UniqueName: \"kubernetes.io/projected/e596f733-d46a-4210-9351-50420781dd29-kube-api-access-gr42b\") pod \"e596f733-d46a-4210-9351-50420781dd29\" (UID: \"e596f733-d46a-4210-9351-50420781dd29\") " Nov 29 16:07:21 crc kubenswrapper[4907]: I1129 16:07:21.934547 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e596f733-d46a-4210-9351-50420781dd29-host" (OuterVolumeSpecName: "host") pod "e596f733-d46a-4210-9351-50420781dd29" (UID: "e596f733-d46a-4210-9351-50420781dd29"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 16:07:21 crc kubenswrapper[4907]: I1129 16:07:21.935565 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e596f733-d46a-4210-9351-50420781dd29-host\") on node \"crc\" DevicePath \"\"" Nov 29 16:07:21 crc kubenswrapper[4907]: I1129 16:07:21.941675 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e596f733-d46a-4210-9351-50420781dd29-kube-api-access-gr42b" (OuterVolumeSpecName: "kube-api-access-gr42b") pod "e596f733-d46a-4210-9351-50420781dd29" (UID: "e596f733-d46a-4210-9351-50420781dd29"). InnerVolumeSpecName "kube-api-access-gr42b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:07:22 crc kubenswrapper[4907]: I1129 16:07:22.037100 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr42b\" (UniqueName: \"kubernetes.io/projected/e596f733-d46a-4210-9351-50420781dd29-kube-api-access-gr42b\") on node \"crc\" DevicePath \"\"" Nov 29 16:07:22 crc kubenswrapper[4907]: I1129 16:07:22.510542 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:07:22 crc kubenswrapper[4907]: E1129 16:07:22.511464 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:07:22 crc kubenswrapper[4907]: I1129 16:07:22.707455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/crc-debug-5btkr" event={"ID":"e596f733-d46a-4210-9351-50420781dd29","Type":"ContainerDied","Data":"992e0263580c275778628636ff95bca4a9c5546af2c91797c0d82092fd508a03"} Nov 29 16:07:22 crc kubenswrapper[4907]: I1129 16:07:22.707498 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="992e0263580c275778628636ff95bca4a9c5546af2c91797c0d82092fd508a03" Nov 29 16:07:22 crc kubenswrapper[4907]: I1129 16:07:22.707586 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-5btkr" Nov 29 16:07:22 crc kubenswrapper[4907]: I1129 16:07:22.875955 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2tpnk/crc-debug-5btkr"] Nov 29 16:07:22 crc kubenswrapper[4907]: I1129 16:07:22.890803 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2tpnk/crc-debug-5btkr"] Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.117602 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2tpnk/crc-debug-dk9z7"] Nov 29 16:07:24 crc kubenswrapper[4907]: E1129 16:07:24.118380 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e596f733-d46a-4210-9351-50420781dd29" containerName="container-00" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.118395 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e596f733-d46a-4210-9351-50420781dd29" containerName="container-00" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.118717 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e596f733-d46a-4210-9351-50420781dd29" containerName="container-00" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.119689 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.183548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb07f57-3822-43f0-ac31-e21721061bd8-host\") pod \"crc-debug-dk9z7\" (UID: \"feb07f57-3822-43f0-ac31-e21721061bd8\") " pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.183658 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljk2p\" (UniqueName: \"kubernetes.io/projected/feb07f57-3822-43f0-ac31-e21721061bd8-kube-api-access-ljk2p\") pod \"crc-debug-dk9z7\" (UID: \"feb07f57-3822-43f0-ac31-e21721061bd8\") " pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.286696 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb07f57-3822-43f0-ac31-e21721061bd8-host\") pod \"crc-debug-dk9z7\" (UID: \"feb07f57-3822-43f0-ac31-e21721061bd8\") " pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.286801 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljk2p\" (UniqueName: \"kubernetes.io/projected/feb07f57-3822-43f0-ac31-e21721061bd8-kube-api-access-ljk2p\") pod \"crc-debug-dk9z7\" (UID: \"feb07f57-3822-43f0-ac31-e21721061bd8\") " pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.287082 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb07f57-3822-43f0-ac31-e21721061bd8-host\") pod \"crc-debug-dk9z7\" (UID: \"feb07f57-3822-43f0-ac31-e21721061bd8\") " pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:24 crc 
kubenswrapper[4907]: I1129 16:07:24.303953 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljk2p\" (UniqueName: \"kubernetes.io/projected/feb07f57-3822-43f0-ac31-e21721061bd8-kube-api-access-ljk2p\") pod \"crc-debug-dk9z7\" (UID: \"feb07f57-3822-43f0-ac31-e21721061bd8\") " pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.441634 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:24 crc kubenswrapper[4907]: W1129 16:07:24.483390 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb07f57_3822_43f0_ac31_e21721061bd8.slice/crio-2d86189a459e02dd3e39a783ead6cd09a310ae066b65d7548599ee267191c2d9 WatchSource:0}: Error finding container 2d86189a459e02dd3e39a783ead6cd09a310ae066b65d7548599ee267191c2d9: Status 404 returned error can't find the container with id 2d86189a459e02dd3e39a783ead6cd09a310ae066b65d7548599ee267191c2d9 Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.506393 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e596f733-d46a-4210-9351-50420781dd29" path="/var/lib/kubelet/pods/e596f733-d46a-4210-9351-50420781dd29/volumes" Nov 29 16:07:24 crc kubenswrapper[4907]: I1129 16:07:24.728707 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" event={"ID":"feb07f57-3822-43f0-ac31-e21721061bd8","Type":"ContainerStarted","Data":"2d86189a459e02dd3e39a783ead6cd09a310ae066b65d7548599ee267191c2d9"} Nov 29 16:07:25 crc kubenswrapper[4907]: I1129 16:07:25.744225 4907 generic.go:334] "Generic (PLEG): container finished" podID="feb07f57-3822-43f0-ac31-e21721061bd8" containerID="c180829b3456fe63fc342859dbaab2ac11a310e6bb4afdb00d3258ac8813abcf" exitCode=0 Nov 29 16:07:25 crc kubenswrapper[4907]: I1129 16:07:25.744813 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" event={"ID":"feb07f57-3822-43f0-ac31-e21721061bd8","Type":"ContainerDied","Data":"c180829b3456fe63fc342859dbaab2ac11a310e6bb4afdb00d3258ac8813abcf"} Nov 29 16:07:25 crc kubenswrapper[4907]: I1129 16:07:25.798877 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2tpnk/crc-debug-dk9z7"] Nov 29 16:07:25 crc kubenswrapper[4907]: I1129 16:07:25.811835 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2tpnk/crc-debug-dk9z7"] Nov 29 16:07:26 crc kubenswrapper[4907]: I1129 16:07:26.869037 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:26 crc kubenswrapper[4907]: I1129 16:07:26.948344 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb07f57-3822-43f0-ac31-e21721061bd8-host\") pod \"feb07f57-3822-43f0-ac31-e21721061bd8\" (UID: \"feb07f57-3822-43f0-ac31-e21721061bd8\") " Nov 29 16:07:26 crc kubenswrapper[4907]: I1129 16:07:26.948430 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/feb07f57-3822-43f0-ac31-e21721061bd8-host" (OuterVolumeSpecName: "host") pod "feb07f57-3822-43f0-ac31-e21721061bd8" (UID: "feb07f57-3822-43f0-ac31-e21721061bd8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 16:07:26 crc kubenswrapper[4907]: I1129 16:07:26.949035 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljk2p\" (UniqueName: \"kubernetes.io/projected/feb07f57-3822-43f0-ac31-e21721061bd8-kube-api-access-ljk2p\") pod \"feb07f57-3822-43f0-ac31-e21721061bd8\" (UID: \"feb07f57-3822-43f0-ac31-e21721061bd8\") " Nov 29 16:07:26 crc kubenswrapper[4907]: I1129 16:07:26.950552 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb07f57-3822-43f0-ac31-e21721061bd8-host\") on node \"crc\" DevicePath \"\"" Nov 29 16:07:26 crc kubenswrapper[4907]: I1129 16:07:26.964418 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb07f57-3822-43f0-ac31-e21721061bd8-kube-api-access-ljk2p" (OuterVolumeSpecName: "kube-api-access-ljk2p") pod "feb07f57-3822-43f0-ac31-e21721061bd8" (UID: "feb07f57-3822-43f0-ac31-e21721061bd8"). InnerVolumeSpecName "kube-api-access-ljk2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:07:27 crc kubenswrapper[4907]: I1129 16:07:27.052756 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljk2p\" (UniqueName: \"kubernetes.io/projected/feb07f57-3822-43f0-ac31-e21721061bd8-kube-api-access-ljk2p\") on node \"crc\" DevicePath \"\"" Nov 29 16:07:27 crc kubenswrapper[4907]: I1129 16:07:27.765943 4907 scope.go:117] "RemoveContainer" containerID="c180829b3456fe63fc342859dbaab2ac11a310e6bb4afdb00d3258ac8813abcf" Nov 29 16:07:27 crc kubenswrapper[4907]: I1129 16:07:27.766017 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2tpnk/crc-debug-dk9z7" Nov 29 16:07:28 crc kubenswrapper[4907]: I1129 16:07:28.490595 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb07f57-3822-43f0-ac31-e21721061bd8" path="/var/lib/kubelet/pods/feb07f57-3822-43f0-ac31-e21721061bd8/volumes" Nov 29 16:07:33 crc kubenswrapper[4907]: I1129 16:07:33.480223 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:07:33 crc kubenswrapper[4907]: E1129 16:07:33.480911 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:07:48 crc kubenswrapper[4907]: I1129 16:07:48.480063 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:07:48 crc kubenswrapper[4907]: E1129 16:07:48.481070 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:07:54 crc kubenswrapper[4907]: I1129 16:07:54.589199 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ceb3061f-2f32-4e89-98f0-628f316bef79/aodh-evaluator/0.log" Nov 29 16:07:54 crc kubenswrapper[4907]: I1129 16:07:54.660493 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_aodh-0_ceb3061f-2f32-4e89-98f0-628f316bef79/aodh-api/0.log" Nov 29 16:07:54 crc kubenswrapper[4907]: I1129 16:07:54.817109 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ceb3061f-2f32-4e89-98f0-628f316bef79/aodh-notifier/0.log" Nov 29 16:07:54 crc kubenswrapper[4907]: I1129 16:07:54.820014 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ceb3061f-2f32-4e89-98f0-628f316bef79/aodh-listener/0.log" Nov 29 16:07:54 crc kubenswrapper[4907]: I1129 16:07:54.989274 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c879c8666-gfj6k_d00a5123-088f-4681-81b0-89706e0cb7a8/barbican-api/0.log" Nov 29 16:07:55 crc kubenswrapper[4907]: I1129 16:07:55.060468 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c879c8666-gfj6k_d00a5123-088f-4681-81b0-89706e0cb7a8/barbican-api-log/0.log" Nov 29 16:07:55 crc kubenswrapper[4907]: I1129 16:07:55.135063 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64d7c8d644-rz2mz_f8bb23b2-9a00-4098-b349-ac5221a0d305/barbican-keystone-listener/0.log" Nov 29 16:07:55 crc kubenswrapper[4907]: I1129 16:07:55.305595 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64d7c8d644-rz2mz_f8bb23b2-9a00-4098-b349-ac5221a0d305/barbican-keystone-listener-log/0.log" Nov 29 16:07:55 crc kubenswrapper[4907]: I1129 16:07:55.412210 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-bbc8f7595-4cqhq_27887b0e-b017-4255-a5db-817cc7142898/barbican-worker/0.log" Nov 29 16:07:55 crc kubenswrapper[4907]: I1129 16:07:55.430097 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-bbc8f7595-4cqhq_27887b0e-b017-4255-a5db-817cc7142898/barbican-worker-log/0.log" Nov 29 16:07:55 crc kubenswrapper[4907]: I1129 16:07:55.829604 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz_527f58ea-f7a3-43c4-aeee-b22f40560466/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:07:55 crc kubenswrapper[4907]: I1129 16:07:55.997250 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a/ceilometer-central-agent/0.log" Nov 29 16:07:56 crc kubenswrapper[4907]: I1129 16:07:56.151551 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a/ceilometer-notification-agent/0.log" Nov 29 16:07:56 crc kubenswrapper[4907]: I1129 16:07:56.186595 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a/proxy-httpd/0.log" Nov 29 16:07:56 crc kubenswrapper[4907]: I1129 16:07:56.286361 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a/sg-core/0.log" Nov 29 16:07:56 crc kubenswrapper[4907]: I1129 16:07:56.409195 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b7255a0d-394e-4d14-bc92-327e101b6ed3/cinder-api-log/0.log" Nov 29 16:07:56 crc kubenswrapper[4907]: I1129 16:07:56.463371 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b7255a0d-394e-4d14-bc92-327e101b6ed3/cinder-api/0.log" Nov 29 16:07:56 crc kubenswrapper[4907]: I1129 16:07:56.550343 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dc42ffed-7148-4260-82d7-0b4a2fecc830/cinder-scheduler/0.log" Nov 29 16:07:56 crc kubenswrapper[4907]: I1129 16:07:56.681077 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dc42ffed-7148-4260-82d7-0b4a2fecc830/probe/0.log" Nov 29 16:07:56 crc kubenswrapper[4907]: I1129 16:07:56.793186 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk_1266709f-ede6-4b61-b733-c40852501bb6/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:07:56 crc kubenswrapper[4907]: I1129 16:07:56.945057 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt_711bdb9b-2232-4b77-83b1-8501049d68cc/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:07:57 crc kubenswrapper[4907]: I1129 16:07:57.033823 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-7bd8m_a05a8a5d-2682-419c-abb4-3b4bb8920a68/init/0.log" Nov 29 16:07:57 crc kubenswrapper[4907]: I1129 16:07:57.235941 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-7bd8m_a05a8a5d-2682-419c-abb4-3b4bb8920a68/init/0.log" Nov 29 16:07:57 crc kubenswrapper[4907]: I1129 16:07:57.267635 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww_c36b07ee-345f-4815-8cb9-25085e925d6a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:07:57 crc kubenswrapper[4907]: I1129 16:07:57.308969 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-7bd8m_a05a8a5d-2682-419c-abb4-3b4bb8920a68/dnsmasq-dns/0.log" Nov 29 16:07:58 crc kubenswrapper[4907]: I1129 16:07:58.217170 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_163933a3-d98e-4701-9124-c821395572eb/glance-httpd/0.log" Nov 29 16:07:58 crc kubenswrapper[4907]: I1129 16:07:58.252319 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_163933a3-d98e-4701-9124-c821395572eb/glance-log/0.log" Nov 29 16:07:58 crc kubenswrapper[4907]: I1129 16:07:58.711998 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_0759e100-595c-4f28-8934-25f0a3bb9010/glance-httpd/0.log" Nov 29 16:07:58 crc kubenswrapper[4907]: I1129 16:07:58.722535 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0759e100-595c-4f28-8934-25f0a3bb9010/glance-log/0.log" Nov 29 16:07:59 crc kubenswrapper[4907]: I1129 16:07:59.131327 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-57dd6cc64-twm82_5d44a8a1-d845-407d-8769-8c0ccbebc4d2/heat-engine/0.log" Nov 29 16:07:59 crc kubenswrapper[4907]: I1129 16:07:59.342879 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-l64sn_720a0e4a-f20d-401e-9c04-fd8c001281c3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:07:59 crc kubenswrapper[4907]: I1129 16:07:59.565600 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5bcf4f8684-6877z_f688c14e-91bc-4525-acd8-a0c9d440dff4/heat-api/0.log" Nov 29 16:07:59 crc kubenswrapper[4907]: I1129 16:07:59.567517 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7b9cj_79c3b8e5-9000-49cd-a62c-ae366a7592b0/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:07:59 crc kubenswrapper[4907]: I1129 16:07:59.636896 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7b4cb5586d-96kjn_ac083b7c-e604-4aa0-98ed-66668134ad44/heat-cfnapi/0.log" Nov 29 16:08:00 crc kubenswrapper[4907]: I1129 16:08:00.459381 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29407141-rgrbj_55d5a03b-62e0-412d-97d5-99a260862255/keystone-cron/0.log" Nov 29 16:08:00 crc kubenswrapper[4907]: I1129 16:08:00.672672 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29407201-vr9k7_23f66094-97ee-4ca1-9f7c-d435aabea4af/keystone-cron/0.log" Nov 29 16:08:00 crc kubenswrapper[4907]: I1129 16:08:00.701046 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e119dfa1-0a93-4e7a-9b97-8530dbde1fbc/kube-state-metrics/0.log" Nov 29 16:08:00 crc kubenswrapper[4907]: I1129 16:08:00.786322 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76d8ccf675-75wqf_9176edef-f683-4a54-a9b0-3ff55a80347b/keystone-api/0.log" Nov 29 16:08:00 crc kubenswrapper[4907]: I1129 16:08:00.912847 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fprns_49802986-4e60-418a-9c0b-5263ebef0944/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:00 crc kubenswrapper[4907]: I1129 16:08:00.959641 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-tmrrg_f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5/logging-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:01 crc kubenswrapper[4907]: I1129 16:08:01.185590 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_406365ac-b529-4ca8-be52-8b802da87feb/mysqld-exporter/0.log" Nov 29 16:08:01 crc kubenswrapper[4907]: I1129 16:08:01.479579 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:08:01 crc kubenswrapper[4907]: E1129 16:08:01.479871 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 
29 16:08:01 crc kubenswrapper[4907]: I1129 16:08:01.514120 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr_dd16d4b2-3a9d-4a05-8564-3de313928ab8/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:01 crc kubenswrapper[4907]: I1129 16:08:01.554022 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68c5f6d545-tlmv5_e814e290-11f9-48bc-9f3d-36aeecf0ec1a/neutron-api/0.log" Nov 29 16:08:01 crc kubenswrapper[4907]: I1129 16:08:01.563201 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68c5f6d545-tlmv5_e814e290-11f9-48bc-9f3d-36aeecf0ec1a/neutron-httpd/0.log" Nov 29 16:08:02 crc kubenswrapper[4907]: I1129 16:08:02.157987 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a21e4e8e-a729-4641-9c99-c022eb3ca6a8/nova-cell0-conductor-conductor/0.log" Nov 29 16:08:02 crc kubenswrapper[4907]: I1129 16:08:02.369954 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3bf63a4b-e850-4b1c-b7ea-00bf87d3d125/nova-api-log/0.log" Nov 29 16:08:02 crc kubenswrapper[4907]: I1129 16:08:02.612812 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3d959bbb-e174-4315-935c-18f5cc65008c/nova-cell1-conductor-conductor/0.log" Nov 29 16:08:02 crc kubenswrapper[4907]: I1129 16:08:02.842471 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3bf63a4b-e850-4b1c-b7ea-00bf87d3d125/nova-api-api/0.log" Nov 29 16:08:02 crc kubenswrapper[4907]: I1129 16:08:02.921807 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9a20c0df-80ec-4e00-9bec-409327ec2c90/nova-cell1-novncproxy-novncproxy/0.log" Nov 29 16:08:03 crc kubenswrapper[4907]: I1129 16:08:03.049933 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-z8mj5_9af53af7-2ded-4b44-92c8-85cb98ea6519/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:03 crc kubenswrapper[4907]: I1129 16:08:03.233340 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8e651a72-97da-438c-9791-42506da10f6f/nova-metadata-log/0.log" Nov 29 16:08:03 crc kubenswrapper[4907]: I1129 16:08:03.596246 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e9f58cda-06bb-41f5-b91d-fdf10dab6164/mysql-bootstrap/0.log" Nov 29 16:08:03 crc kubenswrapper[4907]: I1129 16:08:03.616497 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ad18f82a-09a8-4a8c-ae4b-677fa5dd280d/nova-scheduler-scheduler/0.log" Nov 29 16:08:03 crc kubenswrapper[4907]: I1129 16:08:03.844854 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e9f58cda-06bb-41f5-b91d-fdf10dab6164/mysql-bootstrap/0.log" Nov 29 16:08:03 crc kubenswrapper[4907]: I1129 16:08:03.948343 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e9f58cda-06bb-41f5-b91d-fdf10dab6164/galera/0.log" Nov 29 16:08:04 crc kubenswrapper[4907]: I1129 16:08:04.084803 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2b4712d7-a81b-455f-841a-a0ca14eafcbe/mysql-bootstrap/0.log" Nov 29 16:08:04 crc kubenswrapper[4907]: I1129 16:08:04.266141 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2b4712d7-a81b-455f-841a-a0ca14eafcbe/galera/0.log" Nov 29 16:08:04 crc kubenswrapper[4907]: I1129 16:08:04.280676 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2b4712d7-a81b-455f-841a-a0ca14eafcbe/mysql-bootstrap/0.log" Nov 29 16:08:04 crc kubenswrapper[4907]: I1129 16:08:04.545533 4907 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstackclient_71aeb8b9-6bde-4a3e-a6f1-6d7c192490be/openstackclient/0.log" Nov 29 16:08:04 crc kubenswrapper[4907]: I1129 16:08:04.565698 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m4mz2_33f5965b-43ae-484d-9c5c-1a54ae4de6da/ovn-controller/0.log" Nov 29 16:08:04 crc kubenswrapper[4907]: I1129 16:08:04.807111 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-225jx_79e24485-836a-4a5d-a183-2f8dc0de5c07/openstack-network-exporter/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 16:08:05.036488 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ndrfl_b3d93208-e155-4746-bfd4-2d6d7d04dc2e/ovsdb-server-init/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 16:08:05.264229 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ndrfl_b3d93208-e155-4746-bfd4-2d6d7d04dc2e/ovsdb-server-init/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 16:08:05.321100 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ndrfl_b3d93208-e155-4746-bfd4-2d6d7d04dc2e/ovsdb-server/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 16:08:05.348990 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ndrfl_b3d93208-e155-4746-bfd4-2d6d7d04dc2e/ovs-vswitchd/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 16:08:05.548067 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-88bdf_a0dbf497-7f0c-4aaf-841d-7abbe8299bd9/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 16:08:05.710423 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8e651a72-97da-438c-9791-42506da10f6f/nova-metadata-metadata/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 
16:08:05.718365 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fc8b80af-94d6-4c43-887b-07aafa877200/openstack-network-exporter/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 16:08:05.770208 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fc8b80af-94d6-4c43-887b-07aafa877200/ovn-northd/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 16:08:05.919520 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_76797dae-1bc6-4e63-824b-423fab640187/openstack-network-exporter/0.log" Nov 29 16:08:05 crc kubenswrapper[4907]: I1129 16:08:05.969004 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_76797dae-1bc6-4e63-824b-423fab640187/ovsdbserver-nb/0.log" Nov 29 16:08:06 crc kubenswrapper[4907]: I1129 16:08:06.372503 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7c6489ed-0658-49ec-8ae2-a43a8cf795ef/openstack-network-exporter/0.log" Nov 29 16:08:06 crc kubenswrapper[4907]: I1129 16:08:06.475003 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7c6489ed-0658-49ec-8ae2-a43a8cf795ef/ovsdbserver-sb/0.log" Nov 29 16:08:06 crc kubenswrapper[4907]: I1129 16:08:06.619872 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65859db6b4-hwsds_d47b748c-ba00-496f-83d0-45aaa1049423/placement-api/0.log" Nov 29 16:08:06 crc kubenswrapper[4907]: I1129 16:08:06.719220 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/init-config-reloader/0.log" Nov 29 16:08:06 crc kubenswrapper[4907]: I1129 16:08:06.817251 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65859db6b4-hwsds_d47b748c-ba00-496f-83d0-45aaa1049423/placement-log/0.log" Nov 29 16:08:06 crc kubenswrapper[4907]: I1129 16:08:06.964698 4907 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/init-config-reloader/0.log" Nov 29 16:08:06 crc kubenswrapper[4907]: I1129 16:08:06.993546 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/prometheus/0.log" Nov 29 16:08:07 crc kubenswrapper[4907]: I1129 16:08:07.019315 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/config-reloader/0.log" Nov 29 16:08:07 crc kubenswrapper[4907]: I1129 16:08:07.049764 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/thanos-sidecar/0.log" Nov 29 16:08:07 crc kubenswrapper[4907]: I1129 16:08:07.194339 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_07559e4d-3526-441a-a08d-e11c60e80761/setup-container/0.log" Nov 29 16:08:07 crc kubenswrapper[4907]: I1129 16:08:07.472978 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63f606f9-1313-4d39-8f54-78078cbd256e/setup-container/0.log" Nov 29 16:08:07 crc kubenswrapper[4907]: I1129 16:08:07.486365 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_07559e4d-3526-441a-a08d-e11c60e80761/setup-container/0.log" Nov 29 16:08:07 crc kubenswrapper[4907]: I1129 16:08:07.550870 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_07559e4d-3526-441a-a08d-e11c60e80761/rabbitmq/0.log" Nov 29 16:08:07 crc kubenswrapper[4907]: I1129 16:08:07.773880 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63f606f9-1313-4d39-8f54-78078cbd256e/setup-container/0.log" Nov 29 16:08:07 crc kubenswrapper[4907]: I1129 16:08:07.837186 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63f606f9-1313-4d39-8f54-78078cbd256e/rabbitmq/0.log" Nov 29 16:08:07 crc kubenswrapper[4907]: I1129 16:08:07.878087 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd_ad0ddb73-0774-41ea-999b-a915a2d0f5cd/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:08 crc kubenswrapper[4907]: I1129 16:08:08.010000 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_828d6b05-06be-4157-8163-96a3220fedb0/memcached/0.log" Nov 29 16:08:08 crc kubenswrapper[4907]: I1129 16:08:08.077765 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-vsb6p_36c11dee-b610-4a94-937c-63b049c54f14/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:08 crc kubenswrapper[4907]: I1129 16:08:08.152611 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj_6af6aa2c-2dee-441f-8607-cd1aec4d6fc3/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:08 crc kubenswrapper[4907]: I1129 16:08:08.233101 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-d9sbv_17f0e654-fdc3-4250-9e2e-bf7cb21e7175/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.102933 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5tzc9_d3aef26e-6dd9-447d-b445-09b8c9b80935/ssh-known-hosts-edpm-deployment/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.260814 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6c8fc64d77-lnt4r_89877a72-fedb-44ba-abe3-f74344119594/proxy-server/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.298289 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6c8fc64d77-lnt4r_89877a72-fedb-44ba-abe3-f74344119594/proxy-httpd/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.341299 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-nfb2t_b5bad7a6-9301-4f9e-8303-ae377c4f909f/swift-ring-rebalance/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.476591 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/account-auditor/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.507566 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/account-reaper/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.638289 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/account-replicator/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.765906 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/account-server/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.859603 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/container-auditor/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.878128 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/container-replicator/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.962068 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/container-server/0.log" Nov 29 16:08:09 crc kubenswrapper[4907]: I1129 16:08:09.972380 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-auditor/0.log" Nov 29 16:08:10 crc kubenswrapper[4907]: I1129 16:08:10.007034 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/container-updater/0.log" Nov 29 16:08:10 crc kubenswrapper[4907]: I1129 16:08:10.077141 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-expirer/0.log" Nov 29 16:08:10 crc kubenswrapper[4907]: I1129 16:08:10.119620 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-replicator/0.log" Nov 29 16:08:10 crc kubenswrapper[4907]: I1129 16:08:10.196208 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-server/0.log" Nov 29 16:08:10 crc kubenswrapper[4907]: I1129 16:08:10.217230 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/rsync/0.log" Nov 29 16:08:10 crc kubenswrapper[4907]: I1129 16:08:10.221239 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-updater/0.log" Nov 29 16:08:11 crc kubenswrapper[4907]: I1129 16:08:11.168377 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4_459ecb90-0260-49a5-a146-fb948f9daefb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:11 crc kubenswrapper[4907]: I1129 16:08:11.187510 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/swift-recon-cron/0.log" Nov 29 16:08:11 crc kubenswrapper[4907]: I1129 16:08:11.275785 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b_3ce1e052-0764-4391-9bb8-149e06b8744a/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:11 crc kubenswrapper[4907]: I1129 16:08:11.494716 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_520ecac5-d7f0-4863-8d88-4789fcad831a/test-operator-logs-container/0.log" Nov 29 16:08:11 crc kubenswrapper[4907]: I1129 16:08:11.682415 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-d76lm_5661a247-cd8b-4001-bbf9-841c52c59abc/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:08:11 crc kubenswrapper[4907]: I1129 16:08:11.889841 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e689bb5a-7b28-48c6-995f-bc0dc07078de/tempest-tests-tempest-tests-runner/0.log" Nov 29 16:08:12 crc kubenswrapper[4907]: I1129 16:08:12.489958 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:08:12 crc kubenswrapper[4907]: E1129 16:08:12.490276 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:08:26 crc kubenswrapper[4907]: I1129 16:08:26.483238 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:08:26 crc kubenswrapper[4907]: E1129 16:08:26.484115 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:08:37 crc kubenswrapper[4907]: I1129 16:08:37.480298 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:08:37 crc kubenswrapper[4907]: E1129 16:08:37.481269 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:08:43 crc kubenswrapper[4907]: I1129 16:08:43.456923 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/util/0.log" Nov 29 16:08:44 crc kubenswrapper[4907]: I1129 16:08:44.335687 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/pull/0.log" Nov 29 16:08:44 crc kubenswrapper[4907]: I1129 16:08:44.386321 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/util/0.log" Nov 29 16:08:44 crc kubenswrapper[4907]: I1129 16:08:44.413834 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/pull/0.log" Nov 29 16:08:44 crc kubenswrapper[4907]: I1129 16:08:44.576005 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/util/0.log" Nov 29 16:08:44 crc kubenswrapper[4907]: I1129 16:08:44.633091 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/extract/0.log" Nov 29 16:08:44 crc kubenswrapper[4907]: I1129 16:08:44.663448 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/pull/0.log" Nov 29 16:08:44 crc kubenswrapper[4907]: I1129 16:08:44.796868 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-b8f6h_958f375f-e7a8-4d96-b2a1-dc5a63cdc865/kube-rbac-proxy/0.log" Nov 29 16:08:44 crc kubenswrapper[4907]: I1129 16:08:44.871282 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-b8f6h_958f375f-e7a8-4d96-b2a1-dc5a63cdc865/manager/0.log" Nov 29 16:08:44 crc kubenswrapper[4907]: I1129 16:08:44.902601 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-kqbx2_71e0b5bc-68d6-434d-97c4-0c6d3a324e15/kube-rbac-proxy/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: I1129 16:08:45.042079 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-kqbx2_71e0b5bc-68d6-434d-97c4-0c6d3a324e15/manager/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: 
I1129 16:08:45.118126 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-brs2h_cdcaa6fe-2208-49d5-82d8-9b2c96be251d/kube-rbac-proxy/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: I1129 16:08:45.132555 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-brs2h_cdcaa6fe-2208-49d5-82d8-9b2c96be251d/manager/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: I1129 16:08:45.328596 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-28hrq_aca0ecce-183f-40cd-8ab0-aed5caf29556/kube-rbac-proxy/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: I1129 16:08:45.368138 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-28hrq_aca0ecce-183f-40cd-8ab0-aed5caf29556/manager/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: I1129 16:08:45.454572 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-28vdp_bbc59bc4-78c2-4534-b1bd-93cf4b60f86e/kube-rbac-proxy/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: I1129 16:08:45.592536 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-28vdp_bbc59bc4-78c2-4534-b1bd-93cf4b60f86e/manager/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: I1129 16:08:45.652138 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-wspkj_cf7efbf1-79c8-45f9-8bed-7a33f47226ef/kube-rbac-proxy/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: I1129 16:08:45.726888 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-wspkj_cf7efbf1-79c8-45f9-8bed-7a33f47226ef/manager/0.log" Nov 29 16:08:45 crc kubenswrapper[4907]: I1129 16:08:45.802355 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-z92f7_f9c64e5e-531f-4f11-b7d5-e22ed46b9b86/kube-rbac-proxy/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.032773 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-z92f7_f9c64e5e-531f-4f11-b7d5-e22ed46b9b86/manager/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.073173 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-xqw5l_e12e8dfe-6b2d-49f4-90d1-3165ec08f043/kube-rbac-proxy/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.110647 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-xqw5l_e12e8dfe-6b2d-49f4-90d1-3165ec08f043/manager/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.291295 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-kqq5w_d9c5b591-4e0f-4f9e-930d-070798fccb44/manager/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.311780 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-kqq5w_d9c5b591-4e0f-4f9e-930d-070798fccb44/kube-rbac-proxy/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.458758 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-zspmk_46a74794-f3b0-4bf0-9c94-0920441fd3ce/kube-rbac-proxy/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.491559 
4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-zspmk_46a74794-f3b0-4bf0-9c94-0920441fd3ce/manager/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.593731 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4dlgt_2b9ca9f5-7979-47ef-9e37-f9c519b57445/kube-rbac-proxy/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.697059 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4dlgt_2b9ca9f5-7979-47ef-9e37-f9c519b57445/manager/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.770925 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-jm57w_103b9723-75c0-41ca-8264-41912d22a5cb/kube-rbac-proxy/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.867984 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-jm57w_103b9723-75c0-41ca-8264-41912d22a5cb/manager/0.log" Nov 29 16:08:46 crc kubenswrapper[4907]: I1129 16:08:46.982374 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p8qdf_3cbe9b24-61e0-449f-a91f-289fd9c5de8e/kube-rbac-proxy/0.log" Nov 29 16:08:47 crc kubenswrapper[4907]: I1129 16:08:47.061512 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p8qdf_3cbe9b24-61e0-449f-a91f-289fd9c5de8e/manager/0.log" Nov 29 16:08:47 crc kubenswrapper[4907]: I1129 16:08:47.184171 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-qhggl_8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91/kube-rbac-proxy/0.log" Nov 29 16:08:47 crc 
kubenswrapper[4907]: I1129 16:08:47.224535 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-qhggl_8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91/manager/0.log" Nov 29 16:08:47 crc kubenswrapper[4907]: I1129 16:08:47.236711 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd_711dd79a-4219-43a3-9767-aad244b9c68f/kube-rbac-proxy/0.log" Nov 29 16:08:47 crc kubenswrapper[4907]: I1129 16:08:47.298918 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd_711dd79a-4219-43a3-9767-aad244b9c68f/manager/0.log" Nov 29 16:08:47 crc kubenswrapper[4907]: I1129 16:08:47.577264 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-95b97cc44-vpslq_bd26c3e2-de76-4342-91c6-9ee4571f8619/operator/0.log" Nov 29 16:08:47 crc kubenswrapper[4907]: I1129 16:08:47.607127 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2m2ns_ca80acab-472d-46fd-97c1-f432ddf7bb64/registry-server/0.log" Nov 29 16:08:47 crc kubenswrapper[4907]: I1129 16:08:47.896667 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gs88x_7f331ad4-d753-41f4-82f9-c2bd60806987/kube-rbac-proxy/0.log" Nov 29 16:08:47 crc kubenswrapper[4907]: I1129 16:08:47.946966 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gs88x_7f331ad4-d753-41f4-82f9-c2bd60806987/manager/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.200722 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-fdjx8_d193cf7e-774f-44b3-ae22-090d09c15ba5/kube-rbac-proxy/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.434894 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-kd564_ec0d115a-0b4b-4691-b4f4-778ffd7f6219/operator/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.471153 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-fdjx8_d193cf7e-774f-44b3-ae22-090d09c15ba5/manager/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.594723 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f5c5f9868-d5gtz_e1f880b2-0e04-48c5-81ca-0103abd439fe/manager/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.659223 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-h9tkn_fd726e7f-5139-4eb6-b18d-24d14682648c/kube-rbac-proxy/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.712840 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-h9tkn_fd726e7f-5139-4eb6-b18d-24d14682648c/manager/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.754503 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-86bbb9c7fb-ldhkh_dceea103-7394-4d01-9168-0b3f5b49306f/kube-rbac-proxy/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.980081 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-86bbb9c7fb-ldhkh_dceea103-7394-4d01-9168-0b3f5b49306f/manager/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.990735 
4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-6h9pm_418d0e3c-7354-4e14-b17b-bab93518e78b/kube-rbac-proxy/0.log" Nov 29 16:08:48 crc kubenswrapper[4907]: I1129 16:08:48.991756 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-6h9pm_418d0e3c-7354-4e14-b17b-bab93518e78b/manager/0.log" Nov 29 16:08:49 crc kubenswrapper[4907]: I1129 16:08:49.157075 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-dpdcs_45db6747-0449-4839-b0ed-07e930579b83/kube-rbac-proxy/0.log" Nov 29 16:08:49 crc kubenswrapper[4907]: I1129 16:08:49.168450 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-dpdcs_45db6747-0449-4839-b0ed-07e930579b83/manager/0.log" Nov 29 16:08:52 crc kubenswrapper[4907]: I1129 16:08:52.494556 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:08:52 crc kubenswrapper[4907]: E1129 16:08:52.495365 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:09:03 crc kubenswrapper[4907]: I1129 16:09:03.479887 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:09:03 crc kubenswrapper[4907]: E1129 16:09:03.480599 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:09:14 crc kubenswrapper[4907]: I1129 16:09:14.970405 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-75hhz_0c5dabfe-62e3-4104-9939-59e4832c6484/control-plane-machine-set-operator/0.log" Nov 29 16:09:15 crc kubenswrapper[4907]: I1129 16:09:15.125087 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jcdhm_d5769473-e380-4d0e-bfe4-aab057473a62/kube-rbac-proxy/0.log" Nov 29 16:09:15 crc kubenswrapper[4907]: I1129 16:09:15.126021 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jcdhm_d5769473-e380-4d0e-bfe4-aab057473a62/machine-api-operator/0.log" Nov 29 16:09:16 crc kubenswrapper[4907]: I1129 16:09:16.480838 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:09:16 crc kubenswrapper[4907]: E1129 16:09:16.481528 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:09:29 crc kubenswrapper[4907]: I1129 16:09:29.479605 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:09:29 crc kubenswrapper[4907]: I1129 16:09:29.895693 
4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-j4z4g_b2590975-d739-4401-ae0d-8ef8dd6ba179/cert-manager-controller/0.log" Nov 29 16:09:30 crc kubenswrapper[4907]: I1129 16:09:30.103289 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-ps9pn_a1af2f3d-c619-4288-93c6-721cb89dc1cf/cert-manager-cainjector/0.log" Nov 29 16:09:30 crc kubenswrapper[4907]: I1129 16:09:30.239410 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-79pdn_11b5343c-652a-4d02-841e-2373f5b9f0cf/cert-manager-webhook/0.log" Nov 29 16:09:30 crc kubenswrapper[4907]: I1129 16:09:30.412330 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"5c9e246176adde8029f4d2d0175676987ac7f82b41e64c176c55b0095d1e178e"} Nov 29 16:09:45 crc kubenswrapper[4907]: I1129 16:09:45.272035 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-r4k89_25f5423c-ea17-4f48-9552-7012ca67b559/nmstate-console-plugin/0.log" Nov 29 16:09:45 crc kubenswrapper[4907]: I1129 16:09:45.419118 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-w6v2c_3cc3523e-560b-4af9-a232-0c37f3343fac/nmstate-handler/0.log" Nov 29 16:09:45 crc kubenswrapper[4907]: I1129 16:09:45.503891 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-wrbbp_2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf/kube-rbac-proxy/0.log" Nov 29 16:09:45 crc kubenswrapper[4907]: I1129 16:09:45.542489 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-wrbbp_2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf/nmstate-metrics/0.log" Nov 29 16:09:45 crc kubenswrapper[4907]: 
I1129 16:09:45.689340 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-p8tcl_19da125c-061c-4051-853f-38e13d9a6d5f/nmstate-operator/0.log" Nov 29 16:09:45 crc kubenswrapper[4907]: I1129 16:09:45.769933 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-s4p8c_b0562e46-01ba-4930-a99f-92771a1804a9/nmstate-webhook/0.log" Nov 29 16:10:02 crc kubenswrapper[4907]: I1129 16:10:02.290325 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6ddbc98977-wnwpz_faed25bd-9bb2-4409-927a-e70521fb534c/kube-rbac-proxy/0.log" Nov 29 16:10:02 crc kubenswrapper[4907]: I1129 16:10:02.343418 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6ddbc98977-wnwpz_faed25bd-9bb2-4409-927a-e70521fb534c/manager/0.log" Nov 29 16:10:20 crc kubenswrapper[4907]: I1129 16:10:20.200865 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-22svz_75775bda-f952-44db-a0c1-01993254453f/cluster-logging-operator/0.log" Nov 29 16:10:20 crc kubenswrapper[4907]: I1129 16:10:20.969873 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_1395c265-394b-4f9d-9bab-ebdff563a7b2/loki-compactor/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.005787 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-45v85_6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a/collector/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.180677 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-fxc7c_54d3a7f1-ba2a-4744-937a-4bf219bb85ab/loki-distributor/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.228351 4907 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-586bf9b9f5-7865w_53107106-32ab-4c46-949f-094abb62ce68/gateway/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.231630 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-586bf9b9f5-7865w_53107106-32ab-4c46-949f-094abb62ce68/opa/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.398778 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-586bf9b9f5-wlw8p_fb80f35c-6683-4296-afd9-a0895e860a3d/gateway/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.448558 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-586bf9b9f5-wlw8p_fb80f35c-6683-4296-afd9-a0895e860a3d/opa/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.599407 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_d24d8eb0-2f0a-41f7-9234-2ae2bab4b191/loki-index-gateway/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.665431 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_7e05f45d-0f5c-45a0-81cb-673104c0f806/loki-ingester/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.835386 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-dmfpd_53bdaeef-1d57-48e5-8b2d-bc9edacd5351/loki-querier/0.log" Nov 29 16:10:21 crc kubenswrapper[4907]: I1129 16:10:21.894082 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-xpnnl_34885590-0043-44b5-be42-f726d65f8487/loki-query-frontend/0.log" Nov 29 16:10:36 crc kubenswrapper[4907]: I1129 16:10:36.409200 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-f8648f98b-bkm82_a13cb44c-0bae-4a00-9f98-ad5c6f3c6660/kube-rbac-proxy/0.log" Nov 29 16:10:36 crc kubenswrapper[4907]: I1129 16:10:36.480516 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-bkm82_a13cb44c-0bae-4a00-9f98-ad5c6f3c6660/controller/0.log" Nov 29 16:10:36 crc kubenswrapper[4907]: I1129 16:10:36.621420 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-frr-files/0.log" Nov 29 16:10:36 crc kubenswrapper[4907]: I1129 16:10:36.815869 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-frr-files/0.log" Nov 29 16:10:36 crc kubenswrapper[4907]: I1129 16:10:36.845664 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-metrics/0.log" Nov 29 16:10:36 crc kubenswrapper[4907]: I1129 16:10:36.864045 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-reloader/0.log" Nov 29 16:10:36 crc kubenswrapper[4907]: I1129 16:10:36.872718 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-reloader/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.101313 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-frr-files/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.101376 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-reloader/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.110282 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-metrics/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.130579 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-metrics/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.299026 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-frr-files/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.345246 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-metrics/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.350138 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-reloader/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.374795 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/controller/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.523038 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/frr-metrics/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.573149 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/kube-rbac-proxy-frr/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.577553 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/kube-rbac-proxy/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.756273 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/reloader/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.828902 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-tdzhz_749d56ce-a6c4-4b8f-bd45-0f8a44a9d192/frr-k8s-webhook-server/0.log" Nov 29 16:10:37 crc kubenswrapper[4907]: I1129 16:10:37.965574 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6dfdbf684f-xmtsv_89e52ee4-247d-402e-9c42-8f39e8529314/manager/0.log" Nov 29 16:10:38 crc kubenswrapper[4907]: I1129 16:10:38.204546 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-785f7fb488-nl5qd_008f37e0-a6cb-4202-aed6-fa2b3734e881/webhook-server/0.log" Nov 29 16:10:38 crc kubenswrapper[4907]: I1129 16:10:38.404575 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-csdjw_a0198f8f-d4b9-4452-abda-d3e0df0ec26d/kube-rbac-proxy/0.log" Nov 29 16:10:38 crc kubenswrapper[4907]: I1129 16:10:38.935455 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-csdjw_a0198f8f-d4b9-4452-abda-d3e0df0ec26d/speaker/0.log" Nov 29 16:10:39 crc kubenswrapper[4907]: I1129 16:10:39.193527 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/frr/0.log" Nov 29 16:10:55 crc kubenswrapper[4907]: I1129 16:10:55.884472 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/util/0.log" Nov 29 16:10:56 crc kubenswrapper[4907]: I1129 16:10:56.065050 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/util/0.log" 
Nov 29 16:10:56 crc kubenswrapper[4907]: I1129 16:10:56.107134 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/pull/0.log" Nov 29 16:10:56 crc kubenswrapper[4907]: I1129 16:10:56.107196 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/pull/0.log" Nov 29 16:10:56 crc kubenswrapper[4907]: I1129 16:10:56.252168 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/pull/0.log" Nov 29 16:10:56 crc kubenswrapper[4907]: I1129 16:10:56.292531 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/extract/0.log" Nov 29 16:10:56 crc kubenswrapper[4907]: I1129 16:10:56.336511 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/util/0.log" Nov 29 16:10:56 crc kubenswrapper[4907]: I1129 16:10:56.436022 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/util/0.log" Nov 29 16:10:57 crc kubenswrapper[4907]: I1129 16:10:57.447078 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/util/0.log" Nov 29 16:10:57 crc kubenswrapper[4907]: I1129 16:10:57.492392 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/pull/0.log" Nov 29 16:10:57 crc kubenswrapper[4907]: I1129 16:10:57.505682 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/pull/0.log" Nov 29 16:10:57 crc kubenswrapper[4907]: I1129 16:10:57.752450 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/extract/0.log" Nov 29 16:10:57 crc kubenswrapper[4907]: I1129 16:10:57.779516 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/util/0.log" Nov 29 16:10:57 crc kubenswrapper[4907]: I1129 16:10:57.790340 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/pull/0.log" Nov 29 16:10:57 crc kubenswrapper[4907]: I1129 16:10:57.958026 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/util/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.138836 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/util/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.139619 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/pull/0.log" Nov 29 
16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.173460 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/pull/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.442889 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/pull/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.459975 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/extract/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.480407 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/util/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.530787 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/util/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.680569 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/util/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.695721 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/pull/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.701424 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/pull/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.868944 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/util/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.897264 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/pull/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.939060 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/extract/0.log" Nov 29 16:10:58 crc kubenswrapper[4907]: I1129 16:10:58.947635 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/util/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.133585 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/util/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.139536 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/pull/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.141225 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/pull/0.log" Nov 29 
16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.303900 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/util/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.331069 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/pull/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.383778 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-utilities/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.392855 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/extract/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.585621 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-content/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.590240 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-utilities/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.608924 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-content/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.769509 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-content/0.log" Nov 29 16:10:59 crc 
kubenswrapper[4907]: I1129 16:10:59.809232 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-utilities/0.log" Nov 29 16:10:59 crc kubenswrapper[4907]: I1129 16:10:59.825641 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-utilities/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.045987 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-content/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.080392 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-utilities/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.086924 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-content/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.288700 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-content/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.300996 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-utilities/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.542890 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ltxlt_7964d25d-6ab7-44e4-9737-41d44ea2a311/marketplace-operator/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.584487 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/registry-server/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.631727 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-utilities/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.698216 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/registry-server/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.859344 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-content/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.859507 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-content/0.log" Nov 29 16:11:00 crc kubenswrapper[4907]: I1129 16:11:00.867078 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-utilities/0.log" Nov 29 16:11:01 crc kubenswrapper[4907]: I1129 16:11:01.073794 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-utilities/0.log" Nov 29 16:11:01 crc kubenswrapper[4907]: I1129 16:11:01.094633 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-content/0.log" Nov 29 16:11:01 crc kubenswrapper[4907]: I1129 16:11:01.106819 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-utilities/0.log" Nov 29 16:11:01 crc kubenswrapper[4907]: I1129 16:11:01.276846 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/registry-server/0.log" Nov 29 16:11:01 crc kubenswrapper[4907]: I1129 16:11:01.316663 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-utilities/0.log" Nov 29 16:11:01 crc kubenswrapper[4907]: I1129 16:11:01.328020 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-content/0.log" Nov 29 16:11:01 crc kubenswrapper[4907]: I1129 16:11:01.360192 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-content/0.log" Nov 29 16:11:01 crc kubenswrapper[4907]: I1129 16:11:01.543276 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-utilities/0.log" Nov 29 16:11:01 crc kubenswrapper[4907]: I1129 16:11:01.579729 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-content/0.log" Nov 29 16:11:02 crc kubenswrapper[4907]: I1129 16:11:02.308641 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/registry-server/0.log" Nov 29 16:11:17 crc kubenswrapper[4907]: I1129 16:11:17.895553 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-jt7c8_5c8cbe86-4142-478f-add6-b7d0baf83de6/prometheus-operator/0.log" Nov 29 16:11:18 crc kubenswrapper[4907]: I1129 16:11:18.062162 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_4b633428-8d76-48d9-bde6-b6233e1d7f40/prometheus-operator-admission-webhook/0.log" Nov 29 16:11:18 crc kubenswrapper[4907]: I1129 16:11:18.094452 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_21df79d3-1565-4ab3-bdff-8f63941a44f2/prometheus-operator-admission-webhook/0.log" Nov 29 16:11:18 crc kubenswrapper[4907]: I1129 16:11:18.339052 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-v6mrf_985db950-2dae-4f2f-8ea4-289b661b1481/observability-ui-dashboards/0.log" Nov 29 16:11:18 crc kubenswrapper[4907]: I1129 16:11:18.342594 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-sks2s_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb/operator/0.log" Nov 29 16:11:18 crc kubenswrapper[4907]: I1129 16:11:18.519306 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-5dnkd_258d53e6-9789-4a47-8c51-e928f0ad0f6b/perses-operator/0.log" Nov 29 16:11:34 crc kubenswrapper[4907]: I1129 16:11:34.183866 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6ddbc98977-wnwpz_faed25bd-9bb2-4409-927a-e70521fb534c/kube-rbac-proxy/0.log" Nov 29 16:11:34 crc kubenswrapper[4907]: I1129 16:11:34.285594 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6ddbc98977-wnwpz_faed25bd-9bb2-4409-927a-e70521fb534c/manager/0.log" Nov 29 16:11:46 
crc kubenswrapper[4907]: I1129 16:11:46.541663 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wklqt"] Nov 29 16:11:46 crc kubenswrapper[4907]: E1129 16:11:46.550460 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb07f57-3822-43f0-ac31-e21721061bd8" containerName="container-00" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.550492 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb07f57-3822-43f0-ac31-e21721061bd8" containerName="container-00" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.550815 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb07f57-3822-43f0-ac31-e21721061bd8" containerName="container-00" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.553892 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.597228 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wklqt"] Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.671876 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-utilities\") pod \"community-operators-wklqt\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.672047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-catalog-content\") pod \"community-operators-wklqt\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.672195 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdn9l\" (UniqueName: \"kubernetes.io/projected/9daa6019-f38b-42bb-a7fc-207bc1b75e45-kube-api-access-xdn9l\") pod \"community-operators-wklqt\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: E1129 16:11:46.725755 4907 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.47:56250->38.102.83.47:43783: write tcp 38.102.83.47:56250->38.102.83.47:43783: write: broken pipe Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.775003 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-catalog-content\") pod \"community-operators-wklqt\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.775108 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdn9l\" (UniqueName: \"kubernetes.io/projected/9daa6019-f38b-42bb-a7fc-207bc1b75e45-kube-api-access-xdn9l\") pod \"community-operators-wklqt\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.775181 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-utilities\") pod \"community-operators-wklqt\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.775717 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-utilities\") pod \"community-operators-wklqt\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.775928 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-catalog-content\") pod \"community-operators-wklqt\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.808567 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdn9l\" (UniqueName: \"kubernetes.io/projected/9daa6019-f38b-42bb-a7fc-207bc1b75e45-kube-api-access-xdn9l\") pod \"community-operators-wklqt\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:46 crc kubenswrapper[4907]: I1129 16:11:46.880434 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:47 crc kubenswrapper[4907]: I1129 16:11:47.820217 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wklqt"] Nov 29 16:11:48 crc kubenswrapper[4907]: I1129 16:11:48.112124 4907 generic.go:334] "Generic (PLEG): container finished" podID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerID="72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8" exitCode=0 Nov 29 16:11:48 crc kubenswrapper[4907]: I1129 16:11:48.112486 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wklqt" event={"ID":"9daa6019-f38b-42bb-a7fc-207bc1b75e45","Type":"ContainerDied","Data":"72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8"} Nov 29 16:11:48 crc kubenswrapper[4907]: I1129 16:11:48.112515 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wklqt" event={"ID":"9daa6019-f38b-42bb-a7fc-207bc1b75e45","Type":"ContainerStarted","Data":"ab0df1584063d488fc0848cb226f60cc42f3af0b5f54ccc957d0add268803ca4"} Nov 29 16:11:48 crc kubenswrapper[4907]: I1129 16:11:48.123083 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 16:11:49 crc kubenswrapper[4907]: I1129 16:11:49.127380 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wklqt" event={"ID":"9daa6019-f38b-42bb-a7fc-207bc1b75e45","Type":"ContainerStarted","Data":"1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2"} Nov 29 16:11:50 crc kubenswrapper[4907]: I1129 16:11:50.140953 4907 generic.go:334] "Generic (PLEG): container finished" podID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerID="1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2" exitCode=0 Nov 29 16:11:50 crc kubenswrapper[4907]: I1129 16:11:50.141003 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-wklqt" event={"ID":"9daa6019-f38b-42bb-a7fc-207bc1b75e45","Type":"ContainerDied","Data":"1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2"} Nov 29 16:11:51 crc kubenswrapper[4907]: I1129 16:11:51.152859 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wklqt" event={"ID":"9daa6019-f38b-42bb-a7fc-207bc1b75e45","Type":"ContainerStarted","Data":"39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe"} Nov 29 16:11:51 crc kubenswrapper[4907]: I1129 16:11:51.182338 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wklqt" podStartSLOduration=2.727867879 podStartE2EDuration="5.182321825s" podCreationTimestamp="2025-11-29 16:11:46 +0000 UTC" firstStartedPulling="2025-11-29 16:11:48.122082413 +0000 UTC m=+6206.108920065" lastFinishedPulling="2025-11-29 16:11:50.576536359 +0000 UTC m=+6208.563374011" observedRunningTime="2025-11-29 16:11:51.177764998 +0000 UTC m=+6209.164602641" watchObservedRunningTime="2025-11-29 16:11:51.182321825 +0000 UTC m=+6209.169159477" Nov 29 16:11:54 crc kubenswrapper[4907]: E1129 16:11:54.405770 4907 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.47:35896->38.102.83.47:43783: write tcp 38.102.83.47:35896->38.102.83.47:43783: write: broken pipe Nov 29 16:11:56 crc kubenswrapper[4907]: I1129 16:11:56.882051 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:56 crc kubenswrapper[4907]: I1129 16:11:56.882655 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:56 crc kubenswrapper[4907]: I1129 16:11:56.930762 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:57 crc kubenswrapper[4907]: I1129 16:11:57.275605 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:57 crc kubenswrapper[4907]: I1129 16:11:57.337489 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wklqt"] Nov 29 16:11:58 crc kubenswrapper[4907]: I1129 16:11:58.493245 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:11:58 crc kubenswrapper[4907]: I1129 16:11:58.495718 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:11:59 crc kubenswrapper[4907]: I1129 16:11:59.238188 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wklqt" podUID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerName="registry-server" containerID="cri-o://39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe" gracePeriod=2 Nov 29 16:11:59 crc kubenswrapper[4907]: I1129 16:11:59.824182 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:11:59 crc kubenswrapper[4907]: I1129 16:11:59.910316 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-catalog-content\") pod \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " Nov 29 16:11:59 crc kubenswrapper[4907]: I1129 16:11:59.910671 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdn9l\" (UniqueName: \"kubernetes.io/projected/9daa6019-f38b-42bb-a7fc-207bc1b75e45-kube-api-access-xdn9l\") pod \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " Nov 29 16:11:59 crc kubenswrapper[4907]: I1129 16:11:59.910908 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-utilities\") pod \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\" (UID: \"9daa6019-f38b-42bb-a7fc-207bc1b75e45\") " Nov 29 16:11:59 crc kubenswrapper[4907]: I1129 16:11:59.911663 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-utilities" (OuterVolumeSpecName: "utilities") pod "9daa6019-f38b-42bb-a7fc-207bc1b75e45" (UID: "9daa6019-f38b-42bb-a7fc-207bc1b75e45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:11:59 crc kubenswrapper[4907]: I1129 16:11:59.917378 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9daa6019-f38b-42bb-a7fc-207bc1b75e45-kube-api-access-xdn9l" (OuterVolumeSpecName: "kube-api-access-xdn9l") pod "9daa6019-f38b-42bb-a7fc-207bc1b75e45" (UID: "9daa6019-f38b-42bb-a7fc-207bc1b75e45"). InnerVolumeSpecName "kube-api-access-xdn9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:11:59 crc kubenswrapper[4907]: I1129 16:11:59.958938 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9daa6019-f38b-42bb-a7fc-207bc1b75e45" (UID: "9daa6019-f38b-42bb-a7fc-207bc1b75e45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.013667 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.013705 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdn9l\" (UniqueName: \"kubernetes.io/projected/9daa6019-f38b-42bb-a7fc-207bc1b75e45-kube-api-access-xdn9l\") on node \"crc\" DevicePath \"\"" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.013715 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa6019-f38b-42bb-a7fc-207bc1b75e45-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.256254 4907 generic.go:334] "Generic (PLEG): container finished" podID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerID="39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe" exitCode=0 Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.256336 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wklqt" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.256366 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wklqt" event={"ID":"9daa6019-f38b-42bb-a7fc-207bc1b75e45","Type":"ContainerDied","Data":"39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe"} Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.256924 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wklqt" event={"ID":"9daa6019-f38b-42bb-a7fc-207bc1b75e45","Type":"ContainerDied","Data":"ab0df1584063d488fc0848cb226f60cc42f3af0b5f54ccc957d0add268803ca4"} Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.256958 4907 scope.go:117] "RemoveContainer" containerID="39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.334756 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wklqt"] Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.341392 4907 scope.go:117] "RemoveContainer" containerID="1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.351085 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wklqt"] Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.383154 4907 scope.go:117] "RemoveContainer" containerID="72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.437027 4907 scope.go:117] "RemoveContainer" containerID="39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe" Nov 29 16:12:00 crc kubenswrapper[4907]: E1129 16:12:00.437666 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe\": container with ID starting with 39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe not found: ID does not exist" containerID="39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.437703 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe"} err="failed to get container status \"39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe\": rpc error: code = NotFound desc = could not find container \"39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe\": container with ID starting with 39dd0bae34d0d7c34cb024b3d38dff11ba906a54fe5e6be825f0be701fe16dbe not found: ID does not exist" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.437730 4907 scope.go:117] "RemoveContainer" containerID="1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2" Nov 29 16:12:00 crc kubenswrapper[4907]: E1129 16:12:00.438158 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2\": container with ID starting with 1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2 not found: ID does not exist" containerID="1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.438196 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2"} err="failed to get container status \"1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2\": rpc error: code = NotFound desc = could not find container \"1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2\": container with ID 
starting with 1b838ee08d125ef4e25e2e6cdc835cdedb5530c156cee29819d3555b1392e8a2 not found: ID does not exist" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.438219 4907 scope.go:117] "RemoveContainer" containerID="72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8" Nov 29 16:12:00 crc kubenswrapper[4907]: E1129 16:12:00.439144 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8\": container with ID starting with 72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8 not found: ID does not exist" containerID="72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.439189 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8"} err="failed to get container status \"72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8\": rpc error: code = NotFound desc = could not find container \"72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8\": container with ID starting with 72a744669de38a48170f83af918032fcaa397fce417e11a5618d9abc1ed27ea8 not found: ID does not exist" Nov 29 16:12:00 crc kubenswrapper[4907]: I1129 16:12:00.496029 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" path="/var/lib/kubelet/pods/9daa6019-f38b-42bb-a7fc-207bc1b75e45/volumes" Nov 29 16:12:28 crc kubenswrapper[4907]: I1129 16:12:28.489858 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:12:28 crc kubenswrapper[4907]: I1129 
16:12:28.490303 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.010847 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-phf2n"] Nov 29 16:12:47 crc kubenswrapper[4907]: E1129 16:12:47.012210 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerName="extract-utilities" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.012230 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerName="extract-utilities" Nov 29 16:12:47 crc kubenswrapper[4907]: E1129 16:12:47.012275 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerName="registry-server" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.012286 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerName="registry-server" Nov 29 16:12:47 crc kubenswrapper[4907]: E1129 16:12:47.012328 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerName="extract-content" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.012337 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerName="extract-content" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.012675 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9daa6019-f38b-42bb-a7fc-207bc1b75e45" containerName="registry-server" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.014929 4907 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.024789 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-phf2n"] Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.083428 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-catalog-content\") pod \"redhat-marketplace-phf2n\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.083829 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5np7r\" (UniqueName: \"kubernetes.io/projected/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-kube-api-access-5np7r\") pod \"redhat-marketplace-phf2n\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.084092 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-utilities\") pod \"redhat-marketplace-phf2n\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.186157 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-catalog-content\") pod \"redhat-marketplace-phf2n\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.186236 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5np7r\" (UniqueName: \"kubernetes.io/projected/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-kube-api-access-5np7r\") pod \"redhat-marketplace-phf2n\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.186427 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-utilities\") pod \"redhat-marketplace-phf2n\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.186826 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-catalog-content\") pod \"redhat-marketplace-phf2n\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.186966 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-utilities\") pod \"redhat-marketplace-phf2n\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.211202 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5np7r\" (UniqueName: \"kubernetes.io/projected/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-kube-api-access-5np7r\") pod \"redhat-marketplace-phf2n\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.341402 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.866606 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-phf2n"] Nov 29 16:12:47 crc kubenswrapper[4907]: I1129 16:12:47.997624 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phf2n" event={"ID":"2d8c8723-c58b-4429-bd49-b28f8a8d0e45","Type":"ContainerStarted","Data":"773470f7c8adad6b51d23eb3454a6f5f1cfc18c6975069e9a4d3d0fc7f58aa2a"} Nov 29 16:12:48 crc kubenswrapper[4907]: E1129 16:12:48.143528 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8c8723_c58b_4429_bd49_b28f8a8d0e45.slice/crio-6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8c8723_c58b_4429_bd49_b28f8a8d0e45.slice/crio-conmon-6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346.scope\": RecentStats: unable to find data in memory cache]" Nov 29 16:12:48 crc kubenswrapper[4907]: E1129 16:12:48.143613 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8c8723_c58b_4429_bd49_b28f8a8d0e45.slice/crio-conmon-6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8c8723_c58b_4429_bd49_b28f8a8d0e45.slice/crio-6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346.scope\": RecentStats: unable to find data in memory cache]" Nov 29 16:12:49 crc kubenswrapper[4907]: I1129 16:12:49.018363 4907 generic.go:334] "Generic (PLEG): 
container finished" podID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerID="6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346" exitCode=0 Nov 29 16:12:49 crc kubenswrapper[4907]: I1129 16:12:49.018461 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phf2n" event={"ID":"2d8c8723-c58b-4429-bd49-b28f8a8d0e45","Type":"ContainerDied","Data":"6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346"} Nov 29 16:12:51 crc kubenswrapper[4907]: I1129 16:12:51.048003 4907 generic.go:334] "Generic (PLEG): container finished" podID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerID="e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9" exitCode=0 Nov 29 16:12:51 crc kubenswrapper[4907]: I1129 16:12:51.048068 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phf2n" event={"ID":"2d8c8723-c58b-4429-bd49-b28f8a8d0e45","Type":"ContainerDied","Data":"e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9"} Nov 29 16:12:52 crc kubenswrapper[4907]: I1129 16:12:52.062887 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phf2n" event={"ID":"2d8c8723-c58b-4429-bd49-b28f8a8d0e45","Type":"ContainerStarted","Data":"036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d"} Nov 29 16:12:52 crc kubenswrapper[4907]: I1129 16:12:52.093914 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-phf2n" podStartSLOduration=3.340876903 podStartE2EDuration="6.093891591s" podCreationTimestamp="2025-11-29 16:12:46 +0000 UTC" firstStartedPulling="2025-11-29 16:12:49.021184001 +0000 UTC m=+6267.008021663" lastFinishedPulling="2025-11-29 16:12:51.774198689 +0000 UTC m=+6269.761036351" observedRunningTime="2025-11-29 16:12:52.089087177 +0000 UTC m=+6270.075924869" watchObservedRunningTime="2025-11-29 16:12:52.093891591 +0000 UTC 
m=+6270.080729273" Nov 29 16:12:57 crc kubenswrapper[4907]: I1129 16:12:57.342296 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:57 crc kubenswrapper[4907]: I1129 16:12:57.342887 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:57 crc kubenswrapper[4907]: I1129 16:12:57.398796 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:58 crc kubenswrapper[4907]: I1129 16:12:58.200226 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:12:58 crc kubenswrapper[4907]: I1129 16:12:58.271949 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-phf2n"] Nov 29 16:12:58 crc kubenswrapper[4907]: I1129 16:12:58.495270 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:12:58 crc kubenswrapper[4907]: I1129 16:12:58.495343 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:12:58 crc kubenswrapper[4907]: I1129 16:12:58.507488 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 16:12:58 crc kubenswrapper[4907]: I1129 16:12:58.512886 4907 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c9e246176adde8029f4d2d0175676987ac7f82b41e64c176c55b0095d1e178e"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 16:12:58 crc kubenswrapper[4907]: I1129 16:12:58.513717 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://5c9e246176adde8029f4d2d0175676987ac7f82b41e64c176c55b0095d1e178e" gracePeriod=600 Nov 29 16:12:59 crc kubenswrapper[4907]: I1129 16:12:59.149359 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="5c9e246176adde8029f4d2d0175676987ac7f82b41e64c176c55b0095d1e178e" exitCode=0 Nov 29 16:12:59 crc kubenswrapper[4907]: I1129 16:12:59.151357 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"5c9e246176adde8029f4d2d0175676987ac7f82b41e64c176c55b0095d1e178e"} Nov 29 16:12:59 crc kubenswrapper[4907]: I1129 16:12:59.151423 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4"} Nov 29 16:12:59 crc kubenswrapper[4907]: I1129 16:12:59.151490 4907 scope.go:117] "RemoveContainer" containerID="303f8026fad25fd66deeac5832651459499cdc3c895fac5e1de04840775e029e" Nov 29 16:13:00 crc kubenswrapper[4907]: I1129 16:13:00.168221 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-phf2n" podUID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerName="registry-server" containerID="cri-o://036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d" gracePeriod=2 Nov 29 16:13:00 crc kubenswrapper[4907]: I1129 16:13:00.784782 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:13:00 crc kubenswrapper[4907]: I1129 16:13:00.928330 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-catalog-content\") pod \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " Nov 29 16:13:00 crc kubenswrapper[4907]: I1129 16:13:00.928649 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5np7r\" (UniqueName: \"kubernetes.io/projected/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-kube-api-access-5np7r\") pod \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " Nov 29 16:13:00 crc kubenswrapper[4907]: I1129 16:13:00.928871 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-utilities\") pod \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\" (UID: \"2d8c8723-c58b-4429-bd49-b28f8a8d0e45\") " Nov 29 16:13:00 crc kubenswrapper[4907]: I1129 16:13:00.929717 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-utilities" (OuterVolumeSpecName: "utilities") pod "2d8c8723-c58b-4429-bd49-b28f8a8d0e45" (UID: "2d8c8723-c58b-4429-bd49-b28f8a8d0e45"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:13:00 crc kubenswrapper[4907]: I1129 16:13:00.936701 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-kube-api-access-5np7r" (OuterVolumeSpecName: "kube-api-access-5np7r") pod "2d8c8723-c58b-4429-bd49-b28f8a8d0e45" (UID: "2d8c8723-c58b-4429-bd49-b28f8a8d0e45"). InnerVolumeSpecName "kube-api-access-5np7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:13:00 crc kubenswrapper[4907]: I1129 16:13:00.949914 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d8c8723-c58b-4429-bd49-b28f8a8d0e45" (UID: "2d8c8723-c58b-4429-bd49-b28f8a8d0e45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.031602 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.031805 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.031875 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5np7r\" (UniqueName: \"kubernetes.io/projected/2d8c8723-c58b-4429-bd49-b28f8a8d0e45-kube-api-access-5np7r\") on node \"crc\" DevicePath \"\"" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.192453 4907 generic.go:334] "Generic (PLEG): container finished" podID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" 
containerID="036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d" exitCode=0 Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.192505 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phf2n" event={"ID":"2d8c8723-c58b-4429-bd49-b28f8a8d0e45","Type":"ContainerDied","Data":"036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d"} Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.192591 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phf2n" event={"ID":"2d8c8723-c58b-4429-bd49-b28f8a8d0e45","Type":"ContainerDied","Data":"773470f7c8adad6b51d23eb3454a6f5f1cfc18c6975069e9a4d3d0fc7f58aa2a"} Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.192624 4907 scope.go:117] "RemoveContainer" containerID="036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.193974 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-phf2n" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.243324 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-phf2n"] Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.253858 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-phf2n"] Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.269644 4907 scope.go:117] "RemoveContainer" containerID="e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.296719 4907 scope.go:117] "RemoveContainer" containerID="6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.370597 4907 scope.go:117] "RemoveContainer" containerID="036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d" Nov 29 16:13:01 crc kubenswrapper[4907]: E1129 16:13:01.371248 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d\": container with ID starting with 036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d not found: ID does not exist" containerID="036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.371400 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d"} err="failed to get container status \"036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d\": rpc error: code = NotFound desc = could not find container \"036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d\": container with ID starting with 036c4d783e34f4a352f5a86c3815a039173cc12091883ebb1969daa82195c58d not found: 
ID does not exist" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.371425 4907 scope.go:117] "RemoveContainer" containerID="e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9" Nov 29 16:13:01 crc kubenswrapper[4907]: E1129 16:13:01.371730 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9\": container with ID starting with e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9 not found: ID does not exist" containerID="e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.371749 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9"} err="failed to get container status \"e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9\": rpc error: code = NotFound desc = could not find container \"e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9\": container with ID starting with e8d8c1b68e3e982ad119e31168bf83bf2ab674b6b65332ca107a7df1806461e9 not found: ID does not exist" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.371766 4907 scope.go:117] "RemoveContainer" containerID="6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346" Nov 29 16:13:01 crc kubenswrapper[4907]: E1129 16:13:01.372160 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346\": container with ID starting with 6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346 not found: ID does not exist" containerID="6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346" Nov 29 16:13:01 crc kubenswrapper[4907]: I1129 16:13:01.372197 4907 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346"} err="failed to get container status \"6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346\": rpc error: code = NotFound desc = could not find container \"6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346\": container with ID starting with 6bb690c7233df256233a70b3b55d5a306e565cdeec70486cdd0fb61f94be8346 not found: ID does not exist" Nov 29 16:13:02 crc kubenswrapper[4907]: I1129 16:13:02.502600 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" path="/var/lib/kubelet/pods/2d8c8723-c58b-4429-bd49-b28f8a8d0e45/volumes" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.568964 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9x5fh"] Nov 29 16:13:16 crc kubenswrapper[4907]: E1129 16:13:16.569854 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerName="extract-utilities" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.569865 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerName="extract-utilities" Nov 29 16:13:16 crc kubenswrapper[4907]: E1129 16:13:16.569894 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerName="extract-content" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.569900 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerName="extract-content" Nov 29 16:13:16 crc kubenswrapper[4907]: E1129 16:13:16.569923 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerName="registry-server" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.569929 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerName="registry-server" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.570151 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8c8723-c58b-4429-bd49-b28f8a8d0e45" containerName="registry-server" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.571714 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.605362 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9x5fh"] Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.643161 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-utilities\") pod \"certified-operators-9x5fh\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.643557 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8jz9\" (UniqueName: \"kubernetes.io/projected/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-kube-api-access-z8jz9\") pod \"certified-operators-9x5fh\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.643679 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-catalog-content\") pod \"certified-operators-9x5fh\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 
16:13:16.746963 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-utilities\") pod \"certified-operators-9x5fh\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.747175 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8jz9\" (UniqueName: \"kubernetes.io/projected/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-kube-api-access-z8jz9\") pod \"certified-operators-9x5fh\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.747214 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-catalog-content\") pod \"certified-operators-9x5fh\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.747674 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-utilities\") pod \"certified-operators-9x5fh\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.747695 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-catalog-content\") pod \"certified-operators-9x5fh\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.772277 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8jz9\" (UniqueName: \"kubernetes.io/projected/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-kube-api-access-z8jz9\") pod \"certified-operators-9x5fh\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:16 crc kubenswrapper[4907]: I1129 16:13:16.892922 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:17 crc kubenswrapper[4907]: I1129 16:13:17.429048 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9x5fh"] Nov 29 16:13:18 crc kubenswrapper[4907]: I1129 16:13:18.412383 4907 generic.go:334] "Generic (PLEG): container finished" podID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerID="51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c" exitCode=0 Nov 29 16:13:18 crc kubenswrapper[4907]: I1129 16:13:18.412432 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9x5fh" event={"ID":"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe","Type":"ContainerDied","Data":"51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c"} Nov 29 16:13:18 crc kubenswrapper[4907]: I1129 16:13:18.412702 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9x5fh" event={"ID":"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe","Type":"ContainerStarted","Data":"3c262804c9138b79fdca68e942c3baebe7ba0e698008100e9a3ab19561e818a7"} Nov 29 16:13:19 crc kubenswrapper[4907]: I1129 16:13:19.436825 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9x5fh" event={"ID":"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe","Type":"ContainerStarted","Data":"5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333"} Nov 29 16:13:20 crc kubenswrapper[4907]: I1129 16:13:20.448487 4907 
generic.go:334] "Generic (PLEG): container finished" podID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerID="5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333" exitCode=0 Nov 29 16:13:20 crc kubenswrapper[4907]: I1129 16:13:20.448526 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9x5fh" event={"ID":"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe","Type":"ContainerDied","Data":"5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333"} Nov 29 16:13:22 crc kubenswrapper[4907]: I1129 16:13:22.491177 4907 generic.go:334] "Generic (PLEG): container finished" podID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" containerID="a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8" exitCode=0 Nov 29 16:13:22 crc kubenswrapper[4907]: I1129 16:13:22.502683 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" event={"ID":"c1390a38-2afc-4b68-bf02-d2257ac3ef8c","Type":"ContainerDied","Data":"a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8"} Nov 29 16:13:22 crc kubenswrapper[4907]: I1129 16:13:22.502744 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9x5fh" event={"ID":"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe","Type":"ContainerStarted","Data":"f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977"} Nov 29 16:13:22 crc kubenswrapper[4907]: I1129 16:13:22.504195 4907 scope.go:117] "RemoveContainer" containerID="a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8" Nov 29 16:13:22 crc kubenswrapper[4907]: I1129 16:13:22.531797 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9x5fh" podStartSLOduration=3.156355987 podStartE2EDuration="6.531767589s" podCreationTimestamp="2025-11-29 16:13:16 +0000 UTC" firstStartedPulling="2025-11-29 16:13:18.415351658 +0000 UTC m=+6296.402189300" 
lastFinishedPulling="2025-11-29 16:13:21.79076322 +0000 UTC m=+6299.777600902" observedRunningTime="2025-11-29 16:13:22.5281447 +0000 UTC m=+6300.514982392" watchObservedRunningTime="2025-11-29 16:13:22.531767589 +0000 UTC m=+6300.518605261" Nov 29 16:13:22 crc kubenswrapper[4907]: I1129 16:13:22.884378 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2tpnk_must-gather-xbmcr_c1390a38-2afc-4b68-bf02-d2257ac3ef8c/gather/0.log" Nov 29 16:13:26 crc kubenswrapper[4907]: I1129 16:13:26.893122 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:26 crc kubenswrapper[4907]: I1129 16:13:26.893946 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:26 crc kubenswrapper[4907]: I1129 16:13:26.970599 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:27 crc kubenswrapper[4907]: I1129 16:13:27.644296 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:27 crc kubenswrapper[4907]: I1129 16:13:27.719068 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9x5fh"] Nov 29 16:13:29 crc kubenswrapper[4907]: I1129 16:13:29.591785 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9x5fh" podUID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerName="registry-server" containerID="cri-o://f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977" gracePeriod=2 Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.166364 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.345369 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8jz9\" (UniqueName: \"kubernetes.io/projected/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-kube-api-access-z8jz9\") pod \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.345504 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-utilities\") pod \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.345605 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-catalog-content\") pod \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\" (UID: \"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe\") " Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.347253 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-utilities" (OuterVolumeSpecName: "utilities") pod "771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" (UID: "771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.352503 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-kube-api-access-z8jz9" (OuterVolumeSpecName: "kube-api-access-z8jz9") pod "771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" (UID: "771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe"). InnerVolumeSpecName "kube-api-access-z8jz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.396200 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" (UID: "771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.448758 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.448819 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8jz9\" (UniqueName: \"kubernetes.io/projected/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-kube-api-access-z8jz9\") on node \"crc\" DevicePath \"\"" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.448832 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.610469 4907 generic.go:334] "Generic (PLEG): container finished" podID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerID="f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977" exitCode=0 Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.610541 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9x5fh" event={"ID":"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe","Type":"ContainerDied","Data":"f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977"} Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.610588 4907 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9x5fh" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.610928 4907 scope.go:117] "RemoveContainer" containerID="f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.610908 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9x5fh" event={"ID":"771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe","Type":"ContainerDied","Data":"3c262804c9138b79fdca68e942c3baebe7ba0e698008100e9a3ab19561e818a7"} Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.650365 4907 scope.go:117] "RemoveContainer" containerID="5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.655699 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9x5fh"] Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.668054 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9x5fh"] Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.674434 4907 scope.go:117] "RemoveContainer" containerID="51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.734875 4907 scope.go:117] "RemoveContainer" containerID="f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977" Nov 29 16:13:30 crc kubenswrapper[4907]: E1129 16:13:30.735457 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977\": container with ID starting with f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977 not found: ID does not exist" containerID="f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.735511 
4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977"} err="failed to get container status \"f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977\": rpc error: code = NotFound desc = could not find container \"f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977\": container with ID starting with f61010994e422a169f2de67596c91ad424755f39a5299611bfbd9c612e48e977 not found: ID does not exist" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.735543 4907 scope.go:117] "RemoveContainer" containerID="5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333" Nov 29 16:13:30 crc kubenswrapper[4907]: E1129 16:13:30.736149 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333\": container with ID starting with 5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333 not found: ID does not exist" containerID="5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.736202 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333"} err="failed to get container status \"5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333\": rpc error: code = NotFound desc = could not find container \"5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333\": container with ID starting with 5b56248c08fec766f7343688916488d197fe34f8c04af53b0534ffaeacfe3333 not found: ID does not exist" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.736240 4907 scope.go:117] "RemoveContainer" containerID="51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c" Nov 29 16:13:30 crc kubenswrapper[4907]: E1129 
16:13:30.736586 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c\": container with ID starting with 51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c not found: ID does not exist" containerID="51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c" Nov 29 16:13:30 crc kubenswrapper[4907]: I1129 16:13:30.736669 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c"} err="failed to get container status \"51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c\": rpc error: code = NotFound desc = could not find container \"51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c\": container with ID starting with 51e14dbac5d5600345748397b3fa1048b8afb865c4ff8b50e703e53b63f8174c not found: ID does not exist" Nov 29 16:13:31 crc kubenswrapper[4907]: I1129 16:13:31.920282 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2tpnk/must-gather-xbmcr"] Nov 29 16:13:31 crc kubenswrapper[4907]: I1129 16:13:31.921933 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" podUID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" containerName="copy" containerID="cri-o://cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403" gracePeriod=2 Nov 29 16:13:31 crc kubenswrapper[4907]: I1129 16:13:31.960424 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2tpnk/must-gather-xbmcr"] Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.044297 4907 scope.go:117] "RemoveContainer" containerID="b3a97e0476c36fc5b45840a203d7cc83a5423c572c354909b5246861e622f099" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.417649 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-2tpnk_must-gather-xbmcr_c1390a38-2afc-4b68-bf02-d2257ac3ef8c/copy/0.log" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.418314 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.493139 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" path="/var/lib/kubelet/pods/771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe/volumes" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.497801 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tztg\" (UniqueName: \"kubernetes.io/projected/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-kube-api-access-4tztg\") pod \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\" (UID: \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\") " Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.498052 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-must-gather-output\") pod \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\" (UID: \"c1390a38-2afc-4b68-bf02-d2257ac3ef8c\") " Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.508275 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-kube-api-access-4tztg" (OuterVolumeSpecName: "kube-api-access-4tztg") pod "c1390a38-2afc-4b68-bf02-d2257ac3ef8c" (UID: "c1390a38-2afc-4b68-bf02-d2257ac3ef8c"). InnerVolumeSpecName "kube-api-access-4tztg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.602171 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tztg\" (UniqueName: \"kubernetes.io/projected/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-kube-api-access-4tztg\") on node \"crc\" DevicePath \"\"" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.668969 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c1390a38-2afc-4b68-bf02-d2257ac3ef8c" (UID: "c1390a38-2afc-4b68-bf02-d2257ac3ef8c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.675542 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2tpnk_must-gather-xbmcr_c1390a38-2afc-4b68-bf02-d2257ac3ef8c/copy/0.log" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.676008 4907 generic.go:334] "Generic (PLEG): container finished" podID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" containerID="cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403" exitCode=143 Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.676053 4907 scope.go:117] "RemoveContainer" containerID="cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.676191 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2tpnk/must-gather-xbmcr" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.704646 4907 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c1390a38-2afc-4b68-bf02-d2257ac3ef8c-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.712127 4907 scope.go:117] "RemoveContainer" containerID="a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.760399 4907 scope.go:117] "RemoveContainer" containerID="cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403" Nov 29 16:13:32 crc kubenswrapper[4907]: E1129 16:13:32.760875 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403\": container with ID starting with cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403 not found: ID does not exist" containerID="cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.760943 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403"} err="failed to get container status \"cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403\": rpc error: code = NotFound desc = could not find container \"cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403\": container with ID starting with cf36842898855be5d86c9e2aaf94a99333319cafcd30591a897347da37919403 not found: ID does not exist" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.760977 4907 scope.go:117] "RemoveContainer" containerID="a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8" Nov 29 16:13:32 crc kubenswrapper[4907]: E1129 
16:13:32.761368 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8\": container with ID starting with a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8 not found: ID does not exist" containerID="a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8" Nov 29 16:13:32 crc kubenswrapper[4907]: I1129 16:13:32.761402 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8"} err="failed to get container status \"a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8\": rpc error: code = NotFound desc = could not find container \"a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8\": container with ID starting with a8025c3e736baea58edc7c18415d1269582eeb721a82ad994bf328c531aa26b8 not found: ID does not exist" Nov 29 16:13:34 crc kubenswrapper[4907]: I1129 16:13:34.490818 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" path="/var/lib/kubelet/pods/c1390a38-2afc-4b68-bf02-d2257ac3ef8c/volumes" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.179486 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qpm94"] Nov 29 16:14:39 crc kubenswrapper[4907]: E1129 16:14:39.180424 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerName="extract-utilities" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.180452 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerName="extract-utilities" Nov 29 16:14:39 crc kubenswrapper[4907]: E1129 16:14:39.180470 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerName="registry-server" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.180477 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerName="registry-server" Nov 29 16:14:39 crc kubenswrapper[4907]: E1129 16:14:39.180512 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" containerName="copy" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.180519 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" containerName="copy" Nov 29 16:14:39 crc kubenswrapper[4907]: E1129 16:14:39.180534 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" containerName="gather" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.180540 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" containerName="gather" Nov 29 16:14:39 crc kubenswrapper[4907]: E1129 16:14:39.180558 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerName="extract-content" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.180563 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerName="extract-content" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.180783 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="771da8be-ab5a-4fd9-b8fc-d4ac5d46dcfe" containerName="registry-server" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.180803 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" containerName="copy" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.181118 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1390a38-2afc-4b68-bf02-d2257ac3ef8c" 
containerName="gather" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.182829 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.197484 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qpm94"] Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.330265 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-catalog-content\") pod \"redhat-operators-qpm94\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.330679 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-utilities\") pod \"redhat-operators-qpm94\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.330768 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fq58\" (UniqueName: \"kubernetes.io/projected/a5c22296-a9d8-4231-8e77-2506d7731ce6-kube-api-access-9fq58\") pod \"redhat-operators-qpm94\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.432281 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-catalog-content\") pod \"redhat-operators-qpm94\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " 
pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.432346 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-utilities\") pod \"redhat-operators-qpm94\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.432481 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fq58\" (UniqueName: \"kubernetes.io/projected/a5c22296-a9d8-4231-8e77-2506d7731ce6-kube-api-access-9fq58\") pod \"redhat-operators-qpm94\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.432931 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-catalog-content\") pod \"redhat-operators-qpm94\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.432942 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-utilities\") pod \"redhat-operators-qpm94\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 crc kubenswrapper[4907]: I1129 16:14:39.458618 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fq58\" (UniqueName: \"kubernetes.io/projected/a5c22296-a9d8-4231-8e77-2506d7731ce6-kube-api-access-9fq58\") pod \"redhat-operators-qpm94\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:39 
crc kubenswrapper[4907]: I1129 16:14:39.541752 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:40 crc kubenswrapper[4907]: W1129 16:14:40.066121 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5c22296_a9d8_4231_8e77_2506d7731ce6.slice/crio-f6d81116f42f700a46f172acacf1cc8f6dafa06156abe2df8d58283aed7550f5 WatchSource:0}: Error finding container f6d81116f42f700a46f172acacf1cc8f6dafa06156abe2df8d58283aed7550f5: Status 404 returned error can't find the container with id f6d81116f42f700a46f172acacf1cc8f6dafa06156abe2df8d58283aed7550f5 Nov 29 16:14:40 crc kubenswrapper[4907]: I1129 16:14:40.070138 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qpm94"] Nov 29 16:14:40 crc kubenswrapper[4907]: I1129 16:14:40.581170 4907 generic.go:334] "Generic (PLEG): container finished" podID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerID="5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7" exitCode=0 Nov 29 16:14:40 crc kubenswrapper[4907]: I1129 16:14:40.581506 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpm94" event={"ID":"a5c22296-a9d8-4231-8e77-2506d7731ce6","Type":"ContainerDied","Data":"5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7"} Nov 29 16:14:40 crc kubenswrapper[4907]: I1129 16:14:40.581534 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpm94" event={"ID":"a5c22296-a9d8-4231-8e77-2506d7731ce6","Type":"ContainerStarted","Data":"f6d81116f42f700a46f172acacf1cc8f6dafa06156abe2df8d58283aed7550f5"} Nov 29 16:14:41 crc kubenswrapper[4907]: I1129 16:14:41.607257 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpm94" 
event={"ID":"a5c22296-a9d8-4231-8e77-2506d7731ce6","Type":"ContainerStarted","Data":"09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c"} Nov 29 16:14:44 crc kubenswrapper[4907]: I1129 16:14:44.646816 4907 generic.go:334] "Generic (PLEG): container finished" podID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerID="09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c" exitCode=0 Nov 29 16:14:44 crc kubenswrapper[4907]: I1129 16:14:44.646908 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpm94" event={"ID":"a5c22296-a9d8-4231-8e77-2506d7731ce6","Type":"ContainerDied","Data":"09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c"} Nov 29 16:14:45 crc kubenswrapper[4907]: I1129 16:14:45.661845 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpm94" event={"ID":"a5c22296-a9d8-4231-8e77-2506d7731ce6","Type":"ContainerStarted","Data":"3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5"} Nov 29 16:14:49 crc kubenswrapper[4907]: I1129 16:14:49.542342 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:49 crc kubenswrapper[4907]: I1129 16:14:49.543460 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:50 crc kubenswrapper[4907]: I1129 16:14:50.611391 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qpm94" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerName="registry-server" probeResult="failure" output=< Nov 29 16:14:50 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Nov 29 16:14:50 crc kubenswrapper[4907]: > Nov 29 16:14:58 crc kubenswrapper[4907]: I1129 16:14:58.492562 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:14:58 crc kubenswrapper[4907]: I1129 16:14:58.493039 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:14:59 crc kubenswrapper[4907]: I1129 16:14:59.625635 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:59 crc kubenswrapper[4907]: I1129 16:14:59.647228 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qpm94" podStartSLOduration=15.968005372 podStartE2EDuration="20.647200927s" podCreationTimestamp="2025-11-29 16:14:39 +0000 UTC" firstStartedPulling="2025-11-29 16:14:40.583676685 +0000 UTC m=+6378.570514337" lastFinishedPulling="2025-11-29 16:14:45.2628722 +0000 UTC m=+6383.249709892" observedRunningTime="2025-11-29 16:14:45.688900112 +0000 UTC m=+6383.675737774" watchObservedRunningTime="2025-11-29 16:14:59.647200927 +0000 UTC m=+6397.634038619" Nov 29 16:14:59 crc kubenswrapper[4907]: I1129 16:14:59.695283 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:14:59 crc kubenswrapper[4907]: I1129 16:14:59.879366 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qpm94"] Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.170578 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f"] Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 
16:15:00.173305 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.181389 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.186581 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.188896 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f"] Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.287139 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr6xc\" (UniqueName: \"kubernetes.io/projected/14bc7855-29a5-4bde-b008-6377ff4a8c14-kube-api-access-pr6xc\") pod \"collect-profiles-29407215-pxw8f\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.287280 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14bc7855-29a5-4bde-b008-6377ff4a8c14-secret-volume\") pod \"collect-profiles-29407215-pxw8f\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.287507 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bc7855-29a5-4bde-b008-6377ff4a8c14-config-volume\") pod \"collect-profiles-29407215-pxw8f\" (UID: 
\"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.389633 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bc7855-29a5-4bde-b008-6377ff4a8c14-config-volume\") pod \"collect-profiles-29407215-pxw8f\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.389808 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr6xc\" (UniqueName: \"kubernetes.io/projected/14bc7855-29a5-4bde-b008-6377ff4a8c14-kube-api-access-pr6xc\") pod \"collect-profiles-29407215-pxw8f\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.389895 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14bc7855-29a5-4bde-b008-6377ff4a8c14-secret-volume\") pod \"collect-profiles-29407215-pxw8f\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.391141 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bc7855-29a5-4bde-b008-6377ff4a8c14-config-volume\") pod \"collect-profiles-29407215-pxw8f\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.397350 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/14bc7855-29a5-4bde-b008-6377ff4a8c14-secret-volume\") pod \"collect-profiles-29407215-pxw8f\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.406503 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr6xc\" (UniqueName: \"kubernetes.io/projected/14bc7855-29a5-4bde-b008-6377ff4a8c14-kube-api-access-pr6xc\") pod \"collect-profiles-29407215-pxw8f\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.507463 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:00 crc kubenswrapper[4907]: I1129 16:15:00.861185 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qpm94" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerName="registry-server" containerID="cri-o://3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5" gracePeriod=2 Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.013289 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f"] Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.261939 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.309329 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-utilities\") pod \"a5c22296-a9d8-4231-8e77-2506d7731ce6\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.309397 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-catalog-content\") pod \"a5c22296-a9d8-4231-8e77-2506d7731ce6\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.309493 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fq58\" (UniqueName: \"kubernetes.io/projected/a5c22296-a9d8-4231-8e77-2506d7731ce6-kube-api-access-9fq58\") pod \"a5c22296-a9d8-4231-8e77-2506d7731ce6\" (UID: \"a5c22296-a9d8-4231-8e77-2506d7731ce6\") " Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.310370 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-utilities" (OuterVolumeSpecName: "utilities") pod "a5c22296-a9d8-4231-8e77-2506d7731ce6" (UID: "a5c22296-a9d8-4231-8e77-2506d7731ce6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.316245 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c22296-a9d8-4231-8e77-2506d7731ce6-kube-api-access-9fq58" (OuterVolumeSpecName: "kube-api-access-9fq58") pod "a5c22296-a9d8-4231-8e77-2506d7731ce6" (UID: "a5c22296-a9d8-4231-8e77-2506d7731ce6"). InnerVolumeSpecName "kube-api-access-9fq58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.412764 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fq58\" (UniqueName: \"kubernetes.io/projected/a5c22296-a9d8-4231-8e77-2506d7731ce6-kube-api-access-9fq58\") on node \"crc\" DevicePath \"\"" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.412987 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.431190 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5c22296-a9d8-4231-8e77-2506d7731ce6" (UID: "a5c22296-a9d8-4231-8e77-2506d7731ce6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.516147 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5c22296-a9d8-4231-8e77-2506d7731ce6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.876160 4907 generic.go:334] "Generic (PLEG): container finished" podID="14bc7855-29a5-4bde-b008-6377ff4a8c14" containerID="73394f5516b8a67cbdf1f146439079075064ace9fde0fac9f2810c4ff9791c54" exitCode=0 Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.876258 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" event={"ID":"14bc7855-29a5-4bde-b008-6377ff4a8c14","Type":"ContainerDied","Data":"73394f5516b8a67cbdf1f146439079075064ace9fde0fac9f2810c4ff9791c54"} Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.876286 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" event={"ID":"14bc7855-29a5-4bde-b008-6377ff4a8c14","Type":"ContainerStarted","Data":"acb91b7e58452dde890b4efa12c20cf4d77677c392d3fd5d61581919385fb455"} Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.880518 4907 generic.go:334] "Generic (PLEG): container finished" podID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerID="3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5" exitCode=0 Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.880602 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qpm94" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.880603 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpm94" event={"ID":"a5c22296-a9d8-4231-8e77-2506d7731ce6","Type":"ContainerDied","Data":"3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5"} Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.880945 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qpm94" event={"ID":"a5c22296-a9d8-4231-8e77-2506d7731ce6","Type":"ContainerDied","Data":"f6d81116f42f700a46f172acacf1cc8f6dafa06156abe2df8d58283aed7550f5"} Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.880969 4907 scope.go:117] "RemoveContainer" containerID="3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.921490 4907 scope.go:117] "RemoveContainer" containerID="09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.948180 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qpm94"] Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.953000 4907 scope.go:117] "RemoveContainer" 
containerID="5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7" Nov 29 16:15:01 crc kubenswrapper[4907]: I1129 16:15:01.963945 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qpm94"] Nov 29 16:15:02 crc kubenswrapper[4907]: I1129 16:15:02.023706 4907 scope.go:117] "RemoveContainer" containerID="3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5" Nov 29 16:15:02 crc kubenswrapper[4907]: E1129 16:15:02.024212 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5\": container with ID starting with 3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5 not found: ID does not exist" containerID="3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5" Nov 29 16:15:02 crc kubenswrapper[4907]: I1129 16:15:02.024272 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5"} err="failed to get container status \"3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5\": rpc error: code = NotFound desc = could not find container \"3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5\": container with ID starting with 3eb34ac4ca89640781611e41205c165bb432e87496ed743ec9e6e46d852c65e5 not found: ID does not exist" Nov 29 16:15:02 crc kubenswrapper[4907]: I1129 16:15:02.024308 4907 scope.go:117] "RemoveContainer" containerID="09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c" Nov 29 16:15:02 crc kubenswrapper[4907]: E1129 16:15:02.024763 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c\": container with ID starting with 
09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c not found: ID does not exist" containerID="09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c" Nov 29 16:15:02 crc kubenswrapper[4907]: I1129 16:15:02.024807 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c"} err="failed to get container status \"09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c\": rpc error: code = NotFound desc = could not find container \"09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c\": container with ID starting with 09e2f79c100aca15725b90f59bda23cf8839b79dc0f5e8e356982f82e457f31c not found: ID does not exist" Nov 29 16:15:02 crc kubenswrapper[4907]: I1129 16:15:02.024836 4907 scope.go:117] "RemoveContainer" containerID="5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7" Nov 29 16:15:02 crc kubenswrapper[4907]: E1129 16:15:02.025139 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7\": container with ID starting with 5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7 not found: ID does not exist" containerID="5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7" Nov 29 16:15:02 crc kubenswrapper[4907]: I1129 16:15:02.025237 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7"} err="failed to get container status \"5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7\": rpc error: code = NotFound desc = could not find container \"5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7\": container with ID starting with 5f9f0a454fc7598306a7b56ebadf306882797982d893377f3f84dbcdfbc406e7 not found: ID does not 
exist" Nov 29 16:15:02 crc kubenswrapper[4907]: I1129 16:15:02.504363 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" path="/var/lib/kubelet/pods/a5c22296-a9d8-4231-8e77-2506d7731ce6/volumes" Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.361863 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.466346 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr6xc\" (UniqueName: \"kubernetes.io/projected/14bc7855-29a5-4bde-b008-6377ff4a8c14-kube-api-access-pr6xc\") pod \"14bc7855-29a5-4bde-b008-6377ff4a8c14\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.466474 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bc7855-29a5-4bde-b008-6377ff4a8c14-config-volume\") pod \"14bc7855-29a5-4bde-b008-6377ff4a8c14\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.466592 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14bc7855-29a5-4bde-b008-6377ff4a8c14-secret-volume\") pod \"14bc7855-29a5-4bde-b008-6377ff4a8c14\" (UID: \"14bc7855-29a5-4bde-b008-6377ff4a8c14\") " Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.467557 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14bc7855-29a5-4bde-b008-6377ff4a8c14-config-volume" (OuterVolumeSpecName: "config-volume") pod "14bc7855-29a5-4bde-b008-6377ff4a8c14" (UID: "14bc7855-29a5-4bde-b008-6377ff4a8c14"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.472210 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14bc7855-29a5-4bde-b008-6377ff4a8c14-kube-api-access-pr6xc" (OuterVolumeSpecName: "kube-api-access-pr6xc") pod "14bc7855-29a5-4bde-b008-6377ff4a8c14" (UID: "14bc7855-29a5-4bde-b008-6377ff4a8c14"). InnerVolumeSpecName "kube-api-access-pr6xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.476347 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bc7855-29a5-4bde-b008-6377ff4a8c14-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "14bc7855-29a5-4bde-b008-6377ff4a8c14" (UID: "14bc7855-29a5-4bde-b008-6377ff4a8c14"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.570723 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr6xc\" (UniqueName: \"kubernetes.io/projected/14bc7855-29a5-4bde-b008-6377ff4a8c14-kube-api-access-pr6xc\") on node \"crc\" DevicePath \"\"" Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.570773 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14bc7855-29a5-4bde-b008-6377ff4a8c14-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.570792 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14bc7855-29a5-4bde-b008-6377ff4a8c14-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.921144 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" 
event={"ID":"14bc7855-29a5-4bde-b008-6377ff4a8c14","Type":"ContainerDied","Data":"acb91b7e58452dde890b4efa12c20cf4d77677c392d3fd5d61581919385fb455"} Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.921191 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb91b7e58452dde890b4efa12c20cf4d77677c392d3fd5d61581919385fb455" Nov 29 16:15:03 crc kubenswrapper[4907]: I1129 16:15:03.921271 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29407215-pxw8f" Nov 29 16:15:04 crc kubenswrapper[4907]: I1129 16:15:04.497748 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27"] Nov 29 16:15:04 crc kubenswrapper[4907]: I1129 16:15:04.497812 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29407170-pcd27"] Nov 29 16:15:06 crc kubenswrapper[4907]: I1129 16:15:06.510831 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73b2ed79-7759-486a-bc40-c260b58ae2fa" path="/var/lib/kubelet/pods/73b2ed79-7759-486a-bc40-c260b58ae2fa/volumes" Nov 29 16:15:28 crc kubenswrapper[4907]: I1129 16:15:28.490674 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:15:28 crc kubenswrapper[4907]: I1129 16:15:28.491123 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:15:32 crc 
kubenswrapper[4907]: I1129 16:15:32.270874 4907 scope.go:117] "RemoveContainer" containerID="400e58cf5254f729712a980e364110b1cb31120063a998829a3cd29bbbc59e9f" Nov 29 16:15:58 crc kubenswrapper[4907]: I1129 16:15:58.490399 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:15:58 crc kubenswrapper[4907]: I1129 16:15:58.490977 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:15:58 crc kubenswrapper[4907]: I1129 16:15:58.491027 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" Nov 29 16:15:58 crc kubenswrapper[4907]: I1129 16:15:58.492016 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 16:15:58 crc kubenswrapper[4907]: I1129 16:15:58.492070 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" gracePeriod=600 Nov 29 16:15:58 crc kubenswrapper[4907]: E1129 
16:15:58.615508 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:15:58 crc kubenswrapper[4907]: I1129 16:15:58.726226 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" exitCode=0 Nov 29 16:15:58 crc kubenswrapper[4907]: I1129 16:15:58.726341 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4"} Nov 29 16:15:58 crc kubenswrapper[4907]: I1129 16:15:58.726544 4907 scope.go:117] "RemoveContainer" containerID="5c9e246176adde8029f4d2d0175676987ac7f82b41e64c176c55b0095d1e178e" Nov 29 16:15:58 crc kubenswrapper[4907]: I1129 16:15:58.727804 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:15:58 crc kubenswrapper[4907]: E1129 16:15:58.728513 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:16:13 crc kubenswrapper[4907]: I1129 16:16:13.479980 4907 scope.go:117] "RemoveContainer" 
containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:16:13 crc kubenswrapper[4907]: E1129 16:16:13.480838 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.089392 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dxjdc/must-gather-lgmwk"] Nov 29 16:16:16 crc kubenswrapper[4907]: E1129 16:16:16.090179 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14bc7855-29a5-4bde-b008-6377ff4a8c14" containerName="collect-profiles" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.090192 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="14bc7855-29a5-4bde-b008-6377ff4a8c14" containerName="collect-profiles" Nov 29 16:16:16 crc kubenswrapper[4907]: E1129 16:16:16.090225 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerName="extract-utilities" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.090231 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerName="extract-utilities" Nov 29 16:16:16 crc kubenswrapper[4907]: E1129 16:16:16.090251 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerName="registry-server" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.090257 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerName="registry-server" Nov 29 16:16:16 crc kubenswrapper[4907]: E1129 16:16:16.090271 
4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerName="extract-content" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.090276 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerName="extract-content" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.090515 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c22296-a9d8-4231-8e77-2506d7731ce6" containerName="registry-server" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.090546 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="14bc7855-29a5-4bde-b008-6377ff4a8c14" containerName="collect-profiles" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.096640 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.101322 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dxjdc"/"openshift-service-ca.crt" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.101397 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dxjdc"/"default-dockercfg-rjmfp" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.101336 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dxjdc"/"kube-root-ca.crt" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.126915 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dxjdc/must-gather-lgmwk"] Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.199766 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f55f8cf-3b4e-4370-9980-0958693dc72c-must-gather-output\") pod \"must-gather-lgmwk\" (UID: 
\"3f55f8cf-3b4e-4370-9980-0958693dc72c\") " pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.200077 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnlr8\" (UniqueName: \"kubernetes.io/projected/3f55f8cf-3b4e-4370-9980-0958693dc72c-kube-api-access-rnlr8\") pod \"must-gather-lgmwk\" (UID: \"3f55f8cf-3b4e-4370-9980-0958693dc72c\") " pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.301797 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f55f8cf-3b4e-4370-9980-0958693dc72c-must-gather-output\") pod \"must-gather-lgmwk\" (UID: \"3f55f8cf-3b4e-4370-9980-0958693dc72c\") " pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.302149 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f55f8cf-3b4e-4370-9980-0958693dc72c-must-gather-output\") pod \"must-gather-lgmwk\" (UID: \"3f55f8cf-3b4e-4370-9980-0958693dc72c\") " pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.302211 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnlr8\" (UniqueName: \"kubernetes.io/projected/3f55f8cf-3b4e-4370-9980-0958693dc72c-kube-api-access-rnlr8\") pod \"must-gather-lgmwk\" (UID: \"3f55f8cf-3b4e-4370-9980-0958693dc72c\") " pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.323997 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnlr8\" (UniqueName: \"kubernetes.io/projected/3f55f8cf-3b4e-4370-9980-0958693dc72c-kube-api-access-rnlr8\") pod \"must-gather-lgmwk\" (UID: 
\"3f55f8cf-3b4e-4370-9980-0958693dc72c\") " pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.416206 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.969733 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dxjdc/must-gather-lgmwk"] Nov 29 16:16:16 crc kubenswrapper[4907]: I1129 16:16:16.982180 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" event={"ID":"3f55f8cf-3b4e-4370-9980-0958693dc72c","Type":"ContainerStarted","Data":"b8002557040ad97ff7932c91010a9eda5a83df212c4215a06ecd244d9bf6e6ed"} Nov 29 16:16:17 crc kubenswrapper[4907]: I1129 16:16:17.999140 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" event={"ID":"3f55f8cf-3b4e-4370-9980-0958693dc72c","Type":"ContainerStarted","Data":"2018ebe42f9be0d2863275b7dfd8e130ea1c9c8d71b378d9327f9bd755084201"} Nov 29 16:16:17 crc kubenswrapper[4907]: I1129 16:16:17.999755 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" event={"ID":"3f55f8cf-3b4e-4370-9980-0958693dc72c","Type":"ContainerStarted","Data":"55578ea83743f1b31699c3689fe6a11811d7a505c74ee9110d9265bb49b19011"} Nov 29 16:16:18 crc kubenswrapper[4907]: I1129 16:16:18.036882 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" podStartSLOduration=2.036836171 podStartE2EDuration="2.036836171s" podCreationTimestamp="2025-11-29 16:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 16:16:18.028765189 +0000 UTC m=+6476.015602851" watchObservedRunningTime="2025-11-29 16:16:18.036836171 +0000 UTC 
m=+6476.023673863" Nov 29 16:16:20 crc kubenswrapper[4907]: E1129 16:16:20.646823 4907 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.47:48348->38.102.83.47:43783: read tcp 38.102.83.47:48348->38.102.83.47:43783: read: connection reset by peer Nov 29 16:16:21 crc kubenswrapper[4907]: I1129 16:16:21.133416 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dxjdc/crc-debug-rlm7r"] Nov 29 16:16:21 crc kubenswrapper[4907]: I1129 16:16:21.137044 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:16:21 crc kubenswrapper[4907]: I1129 16:16:21.261071 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/951677d6-03ef-4152-9282-da3bee50c0c1-host\") pod \"crc-debug-rlm7r\" (UID: \"951677d6-03ef-4152-9282-da3bee50c0c1\") " pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:16:21 crc kubenswrapper[4907]: I1129 16:16:21.261811 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkh69\" (UniqueName: \"kubernetes.io/projected/951677d6-03ef-4152-9282-da3bee50c0c1-kube-api-access-qkh69\") pod \"crc-debug-rlm7r\" (UID: \"951677d6-03ef-4152-9282-da3bee50c0c1\") " pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:16:21 crc kubenswrapper[4907]: I1129 16:16:21.364486 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkh69\" (UniqueName: \"kubernetes.io/projected/951677d6-03ef-4152-9282-da3bee50c0c1-kube-api-access-qkh69\") pod \"crc-debug-rlm7r\" (UID: \"951677d6-03ef-4152-9282-da3bee50c0c1\") " pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:16:21 crc kubenswrapper[4907]: I1129 16:16:21.364704 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/951677d6-03ef-4152-9282-da3bee50c0c1-host\") pod \"crc-debug-rlm7r\" (UID: \"951677d6-03ef-4152-9282-da3bee50c0c1\") " pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:16:21 crc kubenswrapper[4907]: I1129 16:16:21.364896 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/951677d6-03ef-4152-9282-da3bee50c0c1-host\") pod \"crc-debug-rlm7r\" (UID: \"951677d6-03ef-4152-9282-da3bee50c0c1\") " pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:16:21 crc kubenswrapper[4907]: I1129 16:16:21.400601 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkh69\" (UniqueName: \"kubernetes.io/projected/951677d6-03ef-4152-9282-da3bee50c0c1-kube-api-access-qkh69\") pod \"crc-debug-rlm7r\" (UID: \"951677d6-03ef-4152-9282-da3bee50c0c1\") " pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:16:21 crc kubenswrapper[4907]: I1129 16:16:21.458958 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:16:21 crc kubenswrapper[4907]: W1129 16:16:21.493772 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod951677d6_03ef_4152_9282_da3bee50c0c1.slice/crio-9cb82c76c5ca9032405441bb728ef230caccf635247f9462b4c6935babd78ca3 WatchSource:0}: Error finding container 9cb82c76c5ca9032405441bb728ef230caccf635247f9462b4c6935babd78ca3: Status 404 returned error can't find the container with id 9cb82c76c5ca9032405441bb728ef230caccf635247f9462b4c6935babd78ca3 Nov 29 16:16:22 crc kubenswrapper[4907]: I1129 16:16:22.073939 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" event={"ID":"951677d6-03ef-4152-9282-da3bee50c0c1","Type":"ContainerStarted","Data":"da216dcb8918ddbdbdcf8db295b82e9b8d1a1789273885fcb4c5397592076e4b"} Nov 29 16:16:22 crc kubenswrapper[4907]: I1129 16:16:22.075417 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" event={"ID":"951677d6-03ef-4152-9282-da3bee50c0c1","Type":"ContainerStarted","Data":"9cb82c76c5ca9032405441bb728ef230caccf635247f9462b4c6935babd78ca3"} Nov 29 16:16:22 crc kubenswrapper[4907]: I1129 16:16:22.106320 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" podStartSLOduration=1.106300157 podStartE2EDuration="1.106300157s" podCreationTimestamp="2025-11-29 16:16:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 16:16:22.086724831 +0000 UTC m=+6480.073562483" watchObservedRunningTime="2025-11-29 16:16:22.106300157 +0000 UTC m=+6480.093137809" Nov 29 16:16:26 crc kubenswrapper[4907]: I1129 16:16:26.479736 4907 scope.go:117] "RemoveContainer" 
containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:16:26 crc kubenswrapper[4907]: E1129 16:16:26.480538 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:16:37 crc kubenswrapper[4907]: I1129 16:16:37.480137 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:16:37 crc kubenswrapper[4907]: E1129 16:16:37.480927 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:16:50 crc kubenswrapper[4907]: I1129 16:16:50.480038 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:16:50 crc kubenswrapper[4907]: E1129 16:16:50.480840 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:17:04 crc kubenswrapper[4907]: I1129 16:17:04.480201 4907 scope.go:117] 
"RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:17:04 crc kubenswrapper[4907]: E1129 16:17:04.481026 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:17:06 crc kubenswrapper[4907]: I1129 16:17:06.578109 4907 generic.go:334] "Generic (PLEG): container finished" podID="951677d6-03ef-4152-9282-da3bee50c0c1" containerID="da216dcb8918ddbdbdcf8db295b82e9b8d1a1789273885fcb4c5397592076e4b" exitCode=0 Nov 29 16:17:06 crc kubenswrapper[4907]: I1129 16:17:06.578247 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" event={"ID":"951677d6-03ef-4152-9282-da3bee50c0c1","Type":"ContainerDied","Data":"da216dcb8918ddbdbdcf8db295b82e9b8d1a1789273885fcb4c5397592076e4b"} Nov 29 16:17:07 crc kubenswrapper[4907]: I1129 16:17:07.723975 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:17:07 crc kubenswrapper[4907]: I1129 16:17:07.789719 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dxjdc/crc-debug-rlm7r"] Nov 29 16:17:07 crc kubenswrapper[4907]: I1129 16:17:07.802534 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dxjdc/crc-debug-rlm7r"] Nov 29 16:17:07 crc kubenswrapper[4907]: I1129 16:17:07.825904 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/951677d6-03ef-4152-9282-da3bee50c0c1-host\") pod \"951677d6-03ef-4152-9282-da3bee50c0c1\" (UID: \"951677d6-03ef-4152-9282-da3bee50c0c1\") " Nov 29 16:17:07 crc kubenswrapper[4907]: I1129 16:17:07.825988 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkh69\" (UniqueName: \"kubernetes.io/projected/951677d6-03ef-4152-9282-da3bee50c0c1-kube-api-access-qkh69\") pod \"951677d6-03ef-4152-9282-da3bee50c0c1\" (UID: \"951677d6-03ef-4152-9282-da3bee50c0c1\") " Nov 29 16:17:07 crc kubenswrapper[4907]: I1129 16:17:07.826556 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/951677d6-03ef-4152-9282-da3bee50c0c1-host" (OuterVolumeSpecName: "host") pod "951677d6-03ef-4152-9282-da3bee50c0c1" (UID: "951677d6-03ef-4152-9282-da3bee50c0c1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 16:17:07 crc kubenswrapper[4907]: I1129 16:17:07.826652 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/951677d6-03ef-4152-9282-da3bee50c0c1-host\") on node \"crc\" DevicePath \"\"" Nov 29 16:17:07 crc kubenswrapper[4907]: I1129 16:17:07.832681 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/951677d6-03ef-4152-9282-da3bee50c0c1-kube-api-access-qkh69" (OuterVolumeSpecName: "kube-api-access-qkh69") pod "951677d6-03ef-4152-9282-da3bee50c0c1" (UID: "951677d6-03ef-4152-9282-da3bee50c0c1"). InnerVolumeSpecName "kube-api-access-qkh69". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:17:07 crc kubenswrapper[4907]: I1129 16:17:07.928509 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkh69\" (UniqueName: \"kubernetes.io/projected/951677d6-03ef-4152-9282-da3bee50c0c1-kube-api-access-qkh69\") on node \"crc\" DevicePath \"\"" Nov 29 16:17:08 crc kubenswrapper[4907]: I1129 16:17:08.502150 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="951677d6-03ef-4152-9282-da3bee50c0c1" path="/var/lib/kubelet/pods/951677d6-03ef-4152-9282-da3bee50c0c1/volumes" Nov 29 16:17:08 crc kubenswrapper[4907]: I1129 16:17:08.602114 4907 scope.go:117] "RemoveContainer" containerID="da216dcb8918ddbdbdcf8db295b82e9b8d1a1789273885fcb4c5397592076e4b" Nov 29 16:17:08 crc kubenswrapper[4907]: I1129 16:17:08.602209 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-rlm7r" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.053401 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dxjdc/crc-debug-dgh9b"] Nov 29 16:17:09 crc kubenswrapper[4907]: E1129 16:17:09.053920 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951677d6-03ef-4152-9282-da3bee50c0c1" containerName="container-00" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.053936 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="951677d6-03ef-4152-9282-da3bee50c0c1" containerName="container-00" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.054250 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="951677d6-03ef-4152-9282-da3bee50c0c1" containerName="container-00" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.055190 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.159644 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbz7\" (UniqueName: \"kubernetes.io/projected/20331517-b0d6-4a82-885b-f0be7edf1695-kube-api-access-9vbz7\") pod \"crc-debug-dgh9b\" (UID: \"20331517-b0d6-4a82-885b-f0be7edf1695\") " pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.160175 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20331517-b0d6-4a82-885b-f0be7edf1695-host\") pod \"crc-debug-dgh9b\" (UID: \"20331517-b0d6-4a82-885b-f0be7edf1695\") " pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.262297 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/20331517-b0d6-4a82-885b-f0be7edf1695-host\") pod \"crc-debug-dgh9b\" (UID: \"20331517-b0d6-4a82-885b-f0be7edf1695\") " pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.262511 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vbz7\" (UniqueName: \"kubernetes.io/projected/20331517-b0d6-4a82-885b-f0be7edf1695-kube-api-access-9vbz7\") pod \"crc-debug-dgh9b\" (UID: \"20331517-b0d6-4a82-885b-f0be7edf1695\") " pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.262521 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20331517-b0d6-4a82-885b-f0be7edf1695-host\") pod \"crc-debug-dgh9b\" (UID: \"20331517-b0d6-4a82-885b-f0be7edf1695\") " pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.285295 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbz7\" (UniqueName: \"kubernetes.io/projected/20331517-b0d6-4a82-885b-f0be7edf1695-kube-api-access-9vbz7\") pod \"crc-debug-dgh9b\" (UID: \"20331517-b0d6-4a82-885b-f0be7edf1695\") " pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.384916 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:09 crc kubenswrapper[4907]: I1129 16:17:09.614698 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" event={"ID":"20331517-b0d6-4a82-885b-f0be7edf1695","Type":"ContainerStarted","Data":"35bc899002d586a0ac63386ebfc8b8eff9f08f3bcfde5a0d5e443629e599d6a3"} Nov 29 16:17:10 crc kubenswrapper[4907]: I1129 16:17:10.624264 4907 generic.go:334] "Generic (PLEG): container finished" podID="20331517-b0d6-4a82-885b-f0be7edf1695" containerID="52d6c1f86f76aac6d5bb798e0ca1c62728cdc22dd537e50e92ab4c72d1f7d37b" exitCode=0 Nov 29 16:17:10 crc kubenswrapper[4907]: I1129 16:17:10.624657 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" event={"ID":"20331517-b0d6-4a82-885b-f0be7edf1695","Type":"ContainerDied","Data":"52d6c1f86f76aac6d5bb798e0ca1c62728cdc22dd537e50e92ab4c72d1f7d37b"} Nov 29 16:17:11 crc kubenswrapper[4907]: I1129 16:17:11.768278 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:11 crc kubenswrapper[4907]: I1129 16:17:11.921019 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20331517-b0d6-4a82-885b-f0be7edf1695-host\") pod \"20331517-b0d6-4a82-885b-f0be7edf1695\" (UID: \"20331517-b0d6-4a82-885b-f0be7edf1695\") " Nov 29 16:17:11 crc kubenswrapper[4907]: I1129 16:17:11.921707 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vbz7\" (UniqueName: \"kubernetes.io/projected/20331517-b0d6-4a82-885b-f0be7edf1695-kube-api-access-9vbz7\") pod \"20331517-b0d6-4a82-885b-f0be7edf1695\" (UID: \"20331517-b0d6-4a82-885b-f0be7edf1695\") " Nov 29 16:17:11 crc kubenswrapper[4907]: I1129 16:17:11.921139 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20331517-b0d6-4a82-885b-f0be7edf1695-host" (OuterVolumeSpecName: "host") pod "20331517-b0d6-4a82-885b-f0be7edf1695" (UID: "20331517-b0d6-4a82-885b-f0be7edf1695"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 16:17:11 crc kubenswrapper[4907]: I1129 16:17:11.922631 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20331517-b0d6-4a82-885b-f0be7edf1695-host\") on node \"crc\" DevicePath \"\"" Nov 29 16:17:11 crc kubenswrapper[4907]: I1129 16:17:11.929152 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20331517-b0d6-4a82-885b-f0be7edf1695-kube-api-access-9vbz7" (OuterVolumeSpecName: "kube-api-access-9vbz7") pod "20331517-b0d6-4a82-885b-f0be7edf1695" (UID: "20331517-b0d6-4a82-885b-f0be7edf1695"). InnerVolumeSpecName "kube-api-access-9vbz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:17:12 crc kubenswrapper[4907]: I1129 16:17:12.024188 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vbz7\" (UniqueName: \"kubernetes.io/projected/20331517-b0d6-4a82-885b-f0be7edf1695-kube-api-access-9vbz7\") on node \"crc\" DevicePath \"\"" Nov 29 16:17:12 crc kubenswrapper[4907]: I1129 16:17:12.658472 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" event={"ID":"20331517-b0d6-4a82-885b-f0be7edf1695","Type":"ContainerDied","Data":"35bc899002d586a0ac63386ebfc8b8eff9f08f3bcfde5a0d5e443629e599d6a3"} Nov 29 16:17:12 crc kubenswrapper[4907]: I1129 16:17:12.658520 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35bc899002d586a0ac63386ebfc8b8eff9f08f3bcfde5a0d5e443629e599d6a3" Nov 29 16:17:12 crc kubenswrapper[4907]: I1129 16:17:12.658529 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-dgh9b" Nov 29 16:17:13 crc kubenswrapper[4907]: I1129 16:17:13.134617 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dxjdc/crc-debug-dgh9b"] Nov 29 16:17:13 crc kubenswrapper[4907]: I1129 16:17:13.145156 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dxjdc/crc-debug-dgh9b"] Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.354248 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dxjdc/crc-debug-6lz2f"] Nov 29 16:17:14 crc kubenswrapper[4907]: E1129 16:17:14.355288 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20331517-b0d6-4a82-885b-f0be7edf1695" containerName="container-00" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.355308 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="20331517-b0d6-4a82-885b-f0be7edf1695" containerName="container-00" Nov 29 16:17:14 crc 
kubenswrapper[4907]: I1129 16:17:14.355626 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="20331517-b0d6-4a82-885b-f0be7edf1695" containerName="container-00" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.356607 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.437689 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8lg\" (UniqueName: \"kubernetes.io/projected/2119e079-794f-479c-a6bc-d0b13b0eb40f-kube-api-access-qj8lg\") pod \"crc-debug-6lz2f\" (UID: \"2119e079-794f-479c-a6bc-d0b13b0eb40f\") " pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.438087 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2119e079-794f-479c-a6bc-d0b13b0eb40f-host\") pod \"crc-debug-6lz2f\" (UID: \"2119e079-794f-479c-a6bc-d0b13b0eb40f\") " pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.503396 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20331517-b0d6-4a82-885b-f0be7edf1695" path="/var/lib/kubelet/pods/20331517-b0d6-4a82-885b-f0be7edf1695/volumes" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.541963 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8lg\" (UniqueName: \"kubernetes.io/projected/2119e079-794f-479c-a6bc-d0b13b0eb40f-kube-api-access-qj8lg\") pod \"crc-debug-6lz2f\" (UID: \"2119e079-794f-479c-a6bc-d0b13b0eb40f\") " pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.542120 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2119e079-794f-479c-a6bc-d0b13b0eb40f-host\") pod \"crc-debug-6lz2f\" (UID: \"2119e079-794f-479c-a6bc-d0b13b0eb40f\") " pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.542835 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2119e079-794f-479c-a6bc-d0b13b0eb40f-host\") pod \"crc-debug-6lz2f\" (UID: \"2119e079-794f-479c-a6bc-d0b13b0eb40f\") " pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.572481 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8lg\" (UniqueName: \"kubernetes.io/projected/2119e079-794f-479c-a6bc-d0b13b0eb40f-kube-api-access-qj8lg\") pod \"crc-debug-6lz2f\" (UID: \"2119e079-794f-479c-a6bc-d0b13b0eb40f\") " pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:14 crc kubenswrapper[4907]: I1129 16:17:14.676274 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:15 crc kubenswrapper[4907]: I1129 16:17:15.714880 4907 generic.go:334] "Generic (PLEG): container finished" podID="2119e079-794f-479c-a6bc-d0b13b0eb40f" containerID="023bd37eca6b79428f9387f1937a69f2af2a0796a3a747b44c984ffb01a1d4cc" exitCode=0 Nov 29 16:17:15 crc kubenswrapper[4907]: I1129 16:17:15.714968 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" event={"ID":"2119e079-794f-479c-a6bc-d0b13b0eb40f","Type":"ContainerDied","Data":"023bd37eca6b79428f9387f1937a69f2af2a0796a3a747b44c984ffb01a1d4cc"} Nov 29 16:17:15 crc kubenswrapper[4907]: I1129 16:17:15.715546 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" event={"ID":"2119e079-794f-479c-a6bc-d0b13b0eb40f","Type":"ContainerStarted","Data":"1998be74e0cfc09ff75e215545744f4a229c4d3fae001054360415a64eaa63ec"} Nov 29 16:17:15 crc kubenswrapper[4907]: I1129 16:17:15.775301 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dxjdc/crc-debug-6lz2f"] Nov 29 16:17:15 crc kubenswrapper[4907]: I1129 16:17:15.791286 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dxjdc/crc-debug-6lz2f"] Nov 29 16:17:16 crc kubenswrapper[4907]: I1129 16:17:16.879840 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:17 crc kubenswrapper[4907]: I1129 16:17:17.003793 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2119e079-794f-479c-a6bc-d0b13b0eb40f-host\") pod \"2119e079-794f-479c-a6bc-d0b13b0eb40f\" (UID: \"2119e079-794f-479c-a6bc-d0b13b0eb40f\") " Nov 29 16:17:17 crc kubenswrapper[4907]: I1129 16:17:17.003941 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2119e079-794f-479c-a6bc-d0b13b0eb40f-host" (OuterVolumeSpecName: "host") pod "2119e079-794f-479c-a6bc-d0b13b0eb40f" (UID: "2119e079-794f-479c-a6bc-d0b13b0eb40f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 16:17:17 crc kubenswrapper[4907]: I1129 16:17:17.003961 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj8lg\" (UniqueName: \"kubernetes.io/projected/2119e079-794f-479c-a6bc-d0b13b0eb40f-kube-api-access-qj8lg\") pod \"2119e079-794f-479c-a6bc-d0b13b0eb40f\" (UID: \"2119e079-794f-479c-a6bc-d0b13b0eb40f\") " Nov 29 16:17:17 crc kubenswrapper[4907]: I1129 16:17:17.004656 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2119e079-794f-479c-a6bc-d0b13b0eb40f-host\") on node \"crc\" DevicePath \"\"" Nov 29 16:17:17 crc kubenswrapper[4907]: I1129 16:17:17.010720 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2119e079-794f-479c-a6bc-d0b13b0eb40f-kube-api-access-qj8lg" (OuterVolumeSpecName: "kube-api-access-qj8lg") pod "2119e079-794f-479c-a6bc-d0b13b0eb40f" (UID: "2119e079-794f-479c-a6bc-d0b13b0eb40f"). InnerVolumeSpecName "kube-api-access-qj8lg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:17:17 crc kubenswrapper[4907]: I1129 16:17:17.110277 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj8lg\" (UniqueName: \"kubernetes.io/projected/2119e079-794f-479c-a6bc-d0b13b0eb40f-kube-api-access-qj8lg\") on node \"crc\" DevicePath \"\"" Nov 29 16:17:17 crc kubenswrapper[4907]: I1129 16:17:17.480275 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:17:17 crc kubenswrapper[4907]: E1129 16:17:17.481183 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:17:17 crc kubenswrapper[4907]: I1129 16:17:17.737057 4907 scope.go:117] "RemoveContainer" containerID="023bd37eca6b79428f9387f1937a69f2af2a0796a3a747b44c984ffb01a1d4cc" Nov 29 16:17:17 crc kubenswrapper[4907]: I1129 16:17:17.737306 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxjdc/crc-debug-6lz2f" Nov 29 16:17:18 crc kubenswrapper[4907]: I1129 16:17:18.491734 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2119e079-794f-479c-a6bc-d0b13b0eb40f" path="/var/lib/kubelet/pods/2119e079-794f-479c-a6bc-d0b13b0eb40f/volumes" Nov 29 16:17:32 crc kubenswrapper[4907]: I1129 16:17:32.500993 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:17:32 crc kubenswrapper[4907]: E1129 16:17:32.501648 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:17:46 crc kubenswrapper[4907]: I1129 16:17:46.480161 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:17:46 crc kubenswrapper[4907]: E1129 16:17:46.481105 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:17:52 crc kubenswrapper[4907]: I1129 16:17:52.443456 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ceb3061f-2f32-4e89-98f0-628f316bef79/aodh-api/0.log" Nov 29 16:17:52 crc kubenswrapper[4907]: I1129 16:17:52.522434 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_aodh-0_ceb3061f-2f32-4e89-98f0-628f316bef79/aodh-evaluator/0.log" Nov 29 16:17:52 crc kubenswrapper[4907]: I1129 16:17:52.619569 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ceb3061f-2f32-4e89-98f0-628f316bef79/aodh-listener/0.log" Nov 29 16:17:52 crc kubenswrapper[4907]: I1129 16:17:52.716867 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_ceb3061f-2f32-4e89-98f0-628f316bef79/aodh-notifier/0.log" Nov 29 16:17:52 crc kubenswrapper[4907]: I1129 16:17:52.825252 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c879c8666-gfj6k_d00a5123-088f-4681-81b0-89706e0cb7a8/barbican-api/0.log" Nov 29 16:17:52 crc kubenswrapper[4907]: I1129 16:17:52.871929 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c879c8666-gfj6k_d00a5123-088f-4681-81b0-89706e0cb7a8/barbican-api-log/0.log" Nov 29 16:17:52 crc kubenswrapper[4907]: I1129 16:17:52.972385 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64d7c8d644-rz2mz_f8bb23b2-9a00-4098-b349-ac5221a0d305/barbican-keystone-listener/0.log" Nov 29 16:17:53 crc kubenswrapper[4907]: I1129 16:17:53.190880 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64d7c8d644-rz2mz_f8bb23b2-9a00-4098-b349-ac5221a0d305/barbican-keystone-listener-log/0.log" Nov 29 16:17:53 crc kubenswrapper[4907]: I1129 16:17:53.232145 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-bbc8f7595-4cqhq_27887b0e-b017-4255-a5db-817cc7142898/barbican-worker/0.log" Nov 29 16:17:53 crc kubenswrapper[4907]: I1129 16:17:53.271363 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-bbc8f7595-4cqhq_27887b0e-b017-4255-a5db-817cc7142898/barbican-worker-log/0.log" Nov 29 16:17:53 crc kubenswrapper[4907]: I1129 16:17:53.496203 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-wl4lz_527f58ea-f7a3-43c4-aeee-b22f40560466/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:53 crc kubenswrapper[4907]: I1129 16:17:53.647194 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a/ceilometer-central-agent/0.log" Nov 29 16:17:53 crc kubenswrapper[4907]: I1129 16:17:53.722308 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a/sg-core/0.log" Nov 29 16:17:53 crc kubenswrapper[4907]: I1129 16:17:53.747012 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a/proxy-httpd/0.log" Nov 29 16:17:53 crc kubenswrapper[4907]: I1129 16:17:53.851572 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6bc944-b7ab-43b8-82eb-7c46a7a5e77a/ceilometer-notification-agent/0.log" Nov 29 16:17:53 crc kubenswrapper[4907]: I1129 16:17:53.973552 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b7255a0d-394e-4d14-bc92-327e101b6ed3/cinder-api-log/0.log" Nov 29 16:17:54 crc kubenswrapper[4907]: I1129 16:17:54.016115 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b7255a0d-394e-4d14-bc92-327e101b6ed3/cinder-api/0.log" Nov 29 16:17:54 crc kubenswrapper[4907]: I1129 16:17:54.174620 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dc42ffed-7148-4260-82d7-0b4a2fecc830/cinder-scheduler/0.log" Nov 29 16:17:54 crc kubenswrapper[4907]: I1129 16:17:54.297645 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dc42ffed-7148-4260-82d7-0b4a2fecc830/probe/0.log" Nov 29 16:17:54 crc kubenswrapper[4907]: I1129 16:17:54.356506 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-p8bwk_1266709f-ede6-4b61-b733-c40852501bb6/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:54 crc kubenswrapper[4907]: I1129 16:17:54.585951 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-p2mnt_711bdb9b-2232-4b77-83b1-8501049d68cc/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:54 crc kubenswrapper[4907]: I1129 16:17:54.627744 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-7bd8m_a05a8a5d-2682-419c-abb4-3b4bb8920a68/init/0.log" Nov 29 16:17:54 crc kubenswrapper[4907]: I1129 16:17:54.774355 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-7bd8m_a05a8a5d-2682-419c-abb4-3b4bb8920a68/init/0.log" Nov 29 16:17:54 crc kubenswrapper[4907]: I1129 16:17:54.908746 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bw4ww_c36b07ee-345f-4815-8cb9-25085e925d6a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:54 crc kubenswrapper[4907]: I1129 16:17:54.909283 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-7bd8m_a05a8a5d-2682-419c-abb4-3b4bb8920a68/dnsmasq-dns/0.log" Nov 29 16:17:55 crc kubenswrapper[4907]: I1129 16:17:55.443740 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_163933a3-d98e-4701-9124-c821395572eb/glance-log/0.log" Nov 29 16:17:55 crc kubenswrapper[4907]: I1129 16:17:55.471186 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_163933a3-d98e-4701-9124-c821395572eb/glance-httpd/0.log" Nov 29 16:17:55 crc kubenswrapper[4907]: I1129 16:17:55.640119 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_0759e100-595c-4f28-8934-25f0a3bb9010/glance-httpd/0.log" Nov 29 16:17:55 crc kubenswrapper[4907]: I1129 16:17:55.670793 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0759e100-595c-4f28-8934-25f0a3bb9010/glance-log/0.log" Nov 29 16:17:56 crc kubenswrapper[4907]: I1129 16:17:56.253903 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-57dd6cc64-twm82_5d44a8a1-d845-407d-8769-8c0ccbebc4d2/heat-engine/0.log" Nov 29 16:17:56 crc kubenswrapper[4907]: I1129 16:17:56.336269 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5bcf4f8684-6877z_f688c14e-91bc-4525-acd8-a0c9d440dff4/heat-api/0.log" Nov 29 16:17:56 crc kubenswrapper[4907]: I1129 16:17:56.434073 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7b4cb5586d-96kjn_ac083b7c-e604-4aa0-98ed-66668134ad44/heat-cfnapi/0.log" Nov 29 16:17:56 crc kubenswrapper[4907]: I1129 16:17:56.712049 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-l64sn_720a0e4a-f20d-401e-9c04-fd8c001281c3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:56 crc kubenswrapper[4907]: I1129 16:17:56.825184 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7b9cj_79c3b8e5-9000-49cd-a62c-ae366a7592b0/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:57 crc kubenswrapper[4907]: I1129 16:17:57.040795 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29407141-rgrbj_55d5a03b-62e0-412d-97d5-99a260862255/keystone-cron/0.log" Nov 29 16:17:57 crc kubenswrapper[4907]: I1129 16:17:57.182063 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29407201-vr9k7_23f66094-97ee-4ca1-9f7c-d435aabea4af/keystone-cron/0.log" Nov 29 16:17:57 crc kubenswrapper[4907]: I1129 16:17:57.327881 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76d8ccf675-75wqf_9176edef-f683-4a54-a9b0-3ff55a80347b/keystone-api/0.log" Nov 29 16:17:57 crc kubenswrapper[4907]: I1129 16:17:57.338387 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e119dfa1-0a93-4e7a-9b97-8530dbde1fbc/kube-state-metrics/0.log" Nov 29 16:17:57 crc kubenswrapper[4907]: I1129 16:17:57.383342 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fprns_49802986-4e60-418a-9c0b-5263ebef0944/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:57 crc kubenswrapper[4907]: I1129 16:17:57.479145 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:17:57 crc kubenswrapper[4907]: E1129 16:17:57.479613 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:17:57 crc kubenswrapper[4907]: I1129 16:17:57.519644 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-tmrrg_f6ad9998-0d0c-456f-ba11-a0fd8e07b8d5/logging-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:57 crc kubenswrapper[4907]: I1129 16:17:57.785462 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_406365ac-b529-4ca8-be52-8b802da87feb/mysqld-exporter/0.log" Nov 
29 16:17:58 crc kubenswrapper[4907]: I1129 16:17:58.093384 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68c5f6d545-tlmv5_e814e290-11f9-48bc-9f3d-36aeecf0ec1a/neutron-httpd/0.log" Nov 29 16:17:58 crc kubenswrapper[4907]: I1129 16:17:58.133500 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-q45vr_dd16d4b2-3a9d-4a05-8564-3de313928ab8/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:58 crc kubenswrapper[4907]: I1129 16:17:58.159561 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68c5f6d545-tlmv5_e814e290-11f9-48bc-9f3d-36aeecf0ec1a/neutron-api/0.log" Nov 29 16:17:58 crc kubenswrapper[4907]: I1129 16:17:58.864307 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a21e4e8e-a729-4641-9c99-c022eb3ca6a8/nova-cell0-conductor-conductor/0.log" Nov 29 16:17:59 crc kubenswrapper[4907]: I1129 16:17:59.157159 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3bf63a4b-e850-4b1c-b7ea-00bf87d3d125/nova-api-log/0.log" Nov 29 16:17:59 crc kubenswrapper[4907]: I1129 16:17:59.236092 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3d959bbb-e174-4315-935c-18f5cc65008c/nova-cell1-conductor-conductor/0.log" Nov 29 16:17:59 crc kubenswrapper[4907]: I1129 16:17:59.584332 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-z8mj5_9af53af7-2ded-4b44-92c8-85cb98ea6519/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:17:59 crc kubenswrapper[4907]: I1129 16:17:59.659125 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9a20c0df-80ec-4e00-9bec-409327ec2c90/nova-cell1-novncproxy-novncproxy/0.log" Nov 29 16:17:59 crc kubenswrapper[4907]: I1129 16:17:59.809028 4907 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-api-0_3bf63a4b-e850-4b1c-b7ea-00bf87d3d125/nova-api-api/0.log" Nov 29 16:17:59 crc kubenswrapper[4907]: I1129 16:17:59.988763 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8e651a72-97da-438c-9791-42506da10f6f/nova-metadata-log/0.log" Nov 29 16:18:00 crc kubenswrapper[4907]: I1129 16:18:00.282614 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e9f58cda-06bb-41f5-b91d-fdf10dab6164/mysql-bootstrap/0.log" Nov 29 16:18:00 crc kubenswrapper[4907]: I1129 16:18:00.355968 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ad18f82a-09a8-4a8c-ae4b-677fa5dd280d/nova-scheduler-scheduler/0.log" Nov 29 16:18:00 crc kubenswrapper[4907]: I1129 16:18:00.480419 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e9f58cda-06bb-41f5-b91d-fdf10dab6164/mysql-bootstrap/0.log" Nov 29 16:18:00 crc kubenswrapper[4907]: I1129 16:18:00.516640 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e9f58cda-06bb-41f5-b91d-fdf10dab6164/galera/0.log" Nov 29 16:18:00 crc kubenswrapper[4907]: I1129 16:18:00.735506 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2b4712d7-a81b-455f-841a-a0ca14eafcbe/mysql-bootstrap/0.log" Nov 29 16:18:00 crc kubenswrapper[4907]: I1129 16:18:00.900453 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2b4712d7-a81b-455f-841a-a0ca14eafcbe/mysql-bootstrap/0.log" Nov 29 16:18:00 crc kubenswrapper[4907]: I1129 16:18:00.947280 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2b4712d7-a81b-455f-841a-a0ca14eafcbe/galera/0.log" Nov 29 16:18:01 crc kubenswrapper[4907]: I1129 16:18:01.133051 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_71aeb8b9-6bde-4a3e-a6f1-6d7c192490be/openstackclient/0.log" Nov 29 16:18:01 crc kubenswrapper[4907]: I1129 16:18:01.166567 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m4mz2_33f5965b-43ae-484d-9c5c-1a54ae4de6da/ovn-controller/0.log" Nov 29 16:18:01 crc kubenswrapper[4907]: I1129 16:18:01.385492 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-225jx_79e24485-836a-4a5d-a183-2f8dc0de5c07/openstack-network-exporter/0.log" Nov 29 16:18:01 crc kubenswrapper[4907]: I1129 16:18:01.603068 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ndrfl_b3d93208-e155-4746-bfd4-2d6d7d04dc2e/ovsdb-server-init/0.log" Nov 29 16:18:01 crc kubenswrapper[4907]: I1129 16:18:01.866116 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ndrfl_b3d93208-e155-4746-bfd4-2d6d7d04dc2e/ovsdb-server/0.log" Nov 29 16:18:01 crc kubenswrapper[4907]: I1129 16:18:01.883402 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ndrfl_b3d93208-e155-4746-bfd4-2d6d7d04dc2e/ovs-vswitchd/0.log" Nov 29 16:18:01 crc kubenswrapper[4907]: I1129 16:18:01.910628 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ndrfl_b3d93208-e155-4746-bfd4-2d6d7d04dc2e/ovsdb-server-init/0.log" Nov 29 16:18:02 crc kubenswrapper[4907]: I1129 16:18:02.180774 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-88bdf_a0dbf497-7f0c-4aaf-841d-7abbe8299bd9/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:18:02 crc kubenswrapper[4907]: I1129 16:18:02.290262 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fc8b80af-94d6-4c43-887b-07aafa877200/openstack-network-exporter/0.log" Nov 29 16:18:02 crc kubenswrapper[4907]: I1129 16:18:02.318658 
4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8e651a72-97da-438c-9791-42506da10f6f/nova-metadata-metadata/0.log" Nov 29 16:18:02 crc kubenswrapper[4907]: I1129 16:18:02.407463 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fc8b80af-94d6-4c43-887b-07aafa877200/ovn-northd/0.log" Nov 29 16:18:02 crc kubenswrapper[4907]: I1129 16:18:02.516699 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_76797dae-1bc6-4e63-824b-423fab640187/openstack-network-exporter/0.log" Nov 29 16:18:02 crc kubenswrapper[4907]: I1129 16:18:02.619808 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_76797dae-1bc6-4e63-824b-423fab640187/ovsdbserver-nb/0.log" Nov 29 16:18:02 crc kubenswrapper[4907]: I1129 16:18:02.729535 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7c6489ed-0658-49ec-8ae2-a43a8cf795ef/openstack-network-exporter/0.log" Nov 29 16:18:03 crc kubenswrapper[4907]: I1129 16:18:03.014334 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7c6489ed-0658-49ec-8ae2-a43a8cf795ef/ovsdbserver-sb/0.log" Nov 29 16:18:03 crc kubenswrapper[4907]: I1129 16:18:03.347011 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/init-config-reloader/0.log" Nov 29 16:18:03 crc kubenswrapper[4907]: I1129 16:18:03.387591 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65859db6b4-hwsds_d47b748c-ba00-496f-83d0-45aaa1049423/placement-log/0.log" Nov 29 16:18:03 crc kubenswrapper[4907]: I1129 16:18:03.399751 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65859db6b4-hwsds_d47b748c-ba00-496f-83d0-45aaa1049423/placement-api/0.log" Nov 29 16:18:03 crc kubenswrapper[4907]: I1129 16:18:03.579753 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/config-reloader/0.log" Nov 29 16:18:03 crc kubenswrapper[4907]: I1129 16:18:03.592016 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/thanos-sidecar/0.log" Nov 29 16:18:03 crc kubenswrapper[4907]: I1129 16:18:03.606104 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/init-config-reloader/0.log" Nov 29 16:18:03 crc kubenswrapper[4907]: I1129 16:18:03.639296 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b794f06f-38a0-4c4d-933b-db50f05ddfb8/prometheus/0.log" Nov 29 16:18:03 crc kubenswrapper[4907]: I1129 16:18:03.844649 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_07559e4d-3526-441a-a08d-e11c60e80761/setup-container/0.log" Nov 29 16:18:04 crc kubenswrapper[4907]: I1129 16:18:04.053563 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_07559e4d-3526-441a-a08d-e11c60e80761/setup-container/0.log" Nov 29 16:18:04 crc kubenswrapper[4907]: I1129 16:18:04.095120 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_07559e4d-3526-441a-a08d-e11c60e80761/rabbitmq/0.log" Nov 29 16:18:04 crc kubenswrapper[4907]: I1129 16:18:04.186511 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63f606f9-1313-4d39-8f54-78078cbd256e/setup-container/0.log" Nov 29 16:18:04 crc kubenswrapper[4907]: I1129 16:18:04.356066 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63f606f9-1313-4d39-8f54-78078cbd256e/setup-container/0.log" Nov 29 16:18:04 crc kubenswrapper[4907]: I1129 16:18:04.425406 4907 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63f606f9-1313-4d39-8f54-78078cbd256e/rabbitmq/0.log" Nov 29 16:18:04 crc kubenswrapper[4907]: I1129 16:18:04.441051 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-wrbcd_ad0ddb73-0774-41ea-999b-a915a2d0f5cd/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:18:04 crc kubenswrapper[4907]: I1129 16:18:04.703961 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6zdwj_6af6aa2c-2dee-441f-8607-cd1aec4d6fc3/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:18:04 crc kubenswrapper[4907]: I1129 16:18:04.712532 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-vsb6p_36c11dee-b610-4a94-937c-63b049c54f14/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.004648 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-d9sbv_17f0e654-fdc3-4250-9e2e-bf7cb21e7175/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.080890 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5tzc9_d3aef26e-6dd9-447d-b445-09b8c9b80935/ssh-known-hosts-edpm-deployment/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.292816 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6c8fc64d77-lnt4r_89877a72-fedb-44ba-abe3-f74344119594/proxy-server/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.454806 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6c8fc64d77-lnt4r_89877a72-fedb-44ba-abe3-f74344119594/proxy-httpd/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.510012 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-nfb2t_b5bad7a6-9301-4f9e-8303-ae377c4f909f/swift-ring-rebalance/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.646899 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/account-auditor/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.819952 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_828d6b05-06be-4157-8163-96a3220fedb0/memcached/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.827631 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/account-reaper/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.896620 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/account-replicator/0.log" Nov 29 16:18:05 crc kubenswrapper[4907]: I1129 16:18:05.995974 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/account-server/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.087463 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/container-server/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.096257 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/container-updater/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.120407 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/container-auditor/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.127154 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/container-replicator/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.225812 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-auditor/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.278227 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-replicator/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.293502 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-expirer/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.299416 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-updater/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.347531 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/object-server/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.410307 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/rsync/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.486175 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe027ad6-8a24-44b5-8bfb-732d5c8fe22a/swift-recon-cron/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.536221 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qfdp4_459ecb90-0260-49a5-a146-fb948f9daefb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.696070 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-m7c4b_3ce1e052-0764-4391-9bb8-149e06b8744a/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.801047 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_520ecac5-d7f0-4863-8d88-4789fcad831a/test-operator-logs-container/0.log" Nov 29 16:18:06 crc kubenswrapper[4907]: I1129 16:18:06.895367 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-d76lm_5661a247-cd8b-4001-bbf9-841c52c59abc/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 29 16:18:07 crc kubenswrapper[4907]: I1129 16:18:07.288468 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e689bb5a-7b28-48c6-995f-bc0dc07078de/tempest-tests-tempest-tests-runner/0.log" Nov 29 16:18:08 crc kubenswrapper[4907]: I1129 16:18:08.479267 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:18:08 crc kubenswrapper[4907]: E1129 16:18:08.479804 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:18:22 crc kubenswrapper[4907]: I1129 16:18:22.487585 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:18:22 crc kubenswrapper[4907]: E1129 16:18:22.488214 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:18:33 crc kubenswrapper[4907]: I1129 16:18:33.352116 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/util/0.log" Nov 29 16:18:33 crc kubenswrapper[4907]: I1129 16:18:33.583537 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/util/0.log" Nov 29 16:18:33 crc kubenswrapper[4907]: I1129 16:18:33.597724 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/pull/0.log" Nov 29 16:18:33 crc kubenswrapper[4907]: I1129 16:18:33.634074 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/pull/0.log" Nov 29 16:18:33 crc kubenswrapper[4907]: I1129 16:18:33.776713 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/util/0.log" Nov 29 16:18:33 crc kubenswrapper[4907]: I1129 16:18:33.791026 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/pull/0.log" Nov 29 16:18:33 crc kubenswrapper[4907]: I1129 16:18:33.804718 4907 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fdpmc9s_089ab608-2dbf-489d-bcc8-cb61ab4564b4/extract/0.log" Nov 29 16:18:33 crc kubenswrapper[4907]: I1129 16:18:33.974123 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-b8f6h_958f375f-e7a8-4d96-b2a1-dc5a63cdc865/kube-rbac-proxy/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.025108 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-b8f6h_958f375f-e7a8-4d96-b2a1-dc5a63cdc865/manager/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.056192 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-kqbx2_71e0b5bc-68d6-434d-97c4-0c6d3a324e15/kube-rbac-proxy/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.174391 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-kqbx2_71e0b5bc-68d6-434d-97c4-0c6d3a324e15/manager/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.231385 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-brs2h_cdcaa6fe-2208-49d5-82d8-9b2c96be251d/manager/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.243262 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-brs2h_cdcaa6fe-2208-49d5-82d8-9b2c96be251d/kube-rbac-proxy/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.417145 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-28hrq_aca0ecce-183f-40cd-8ab0-aed5caf29556/kube-rbac-proxy/0.log" Nov 29 16:18:34 crc 
kubenswrapper[4907]: I1129 16:18:34.536004 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-28hrq_aca0ecce-183f-40cd-8ab0-aed5caf29556/manager/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.608155 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-28vdp_bbc59bc4-78c2-4534-b1bd-93cf4b60f86e/kube-rbac-proxy/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.683413 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-28vdp_bbc59bc4-78c2-4534-b1bd-93cf4b60f86e/manager/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.752183 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-wspkj_cf7efbf1-79c8-45f9-8bed-7a33f47226ef/kube-rbac-proxy/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.818497 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-wspkj_cf7efbf1-79c8-45f9-8bed-7a33f47226ef/manager/0.log" Nov 29 16:18:34 crc kubenswrapper[4907]: I1129 16:18:34.926097 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-z92f7_f9c64e5e-531f-4f11-b7d5-e22ed46b9b86/kube-rbac-proxy/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.091791 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-z92f7_f9c64e5e-531f-4f11-b7d5-e22ed46b9b86/manager/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.149705 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-xqw5l_e12e8dfe-6b2d-49f4-90d1-3165ec08f043/kube-rbac-proxy/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.202259 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-xqw5l_e12e8dfe-6b2d-49f4-90d1-3165ec08f043/manager/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.274625 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-kqq5w_d9c5b591-4e0f-4f9e-930d-070798fccb44/kube-rbac-proxy/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.433394 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-kqq5w_d9c5b591-4e0f-4f9e-930d-070798fccb44/manager/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.489273 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-zspmk_46a74794-f3b0-4bf0-9c94-0920441fd3ce/kube-rbac-proxy/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.516121 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-zspmk_46a74794-f3b0-4bf0-9c94-0920441fd3ce/manager/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.650408 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4dlgt_2b9ca9f5-7979-47ef-9e37-f9c519b57445/kube-rbac-proxy/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.691638 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4dlgt_2b9ca9f5-7979-47ef-9e37-f9c519b57445/manager/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 
16:18:35.816537 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-jm57w_103b9723-75c0-41ca-8264-41912d22a5cb/kube-rbac-proxy/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.891076 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-jm57w_103b9723-75c0-41ca-8264-41912d22a5cb/manager/0.log" Nov 29 16:18:35 crc kubenswrapper[4907]: I1129 16:18:35.935049 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p8qdf_3cbe9b24-61e0-449f-a91f-289fd9c5de8e/kube-rbac-proxy/0.log" Nov 29 16:18:36 crc kubenswrapper[4907]: I1129 16:18:36.058415 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p8qdf_3cbe9b24-61e0-449f-a91f-289fd9c5de8e/manager/0.log" Nov 29 16:18:36 crc kubenswrapper[4907]: I1129 16:18:36.114200 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-qhggl_8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91/kube-rbac-proxy/0.log" Nov 29 16:18:36 crc kubenswrapper[4907]: I1129 16:18:36.117483 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-qhggl_8c2b8c08-4eb0-4a87-bb1f-87e08ff1aa91/manager/0.log" Nov 29 16:18:36 crc kubenswrapper[4907]: I1129 16:18:36.265650 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd_711dd79a-4219-43a3-9767-aad244b9c68f/kube-rbac-proxy/0.log" Nov 29 16:18:36 crc kubenswrapper[4907]: I1129 16:18:36.342961 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4bgthd_711dd79a-4219-43a3-9767-aad244b9c68f/manager/0.log" Nov 29 16:18:36 crc kubenswrapper[4907]: I1129 16:18:36.668971 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-95b97cc44-vpslq_bd26c3e2-de76-4342-91c6-9ee4571f8619/operator/0.log" Nov 29 16:18:36 crc kubenswrapper[4907]: I1129 16:18:36.677018 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2m2ns_ca80acab-472d-46fd-97c1-f432ddf7bb64/registry-server/0.log" Nov 29 16:18:36 crc kubenswrapper[4907]: I1129 16:18:36.827357 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gs88x_7f331ad4-d753-41f4-82f9-c2bd60806987/kube-rbac-proxy/0.log" Nov 29 16:18:36 crc kubenswrapper[4907]: I1129 16:18:36.920057 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gs88x_7f331ad4-d753-41f4-82f9-c2bd60806987/manager/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.014488 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-fdjx8_d193cf7e-774f-44b3-ae22-090d09c15ba5/kube-rbac-proxy/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.092408 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-fdjx8_d193cf7e-774f-44b3-ae22-090d09c15ba5/manager/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.169373 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-kd564_ec0d115a-0b4b-4691-b4f4-778ffd7f6219/operator/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.295531 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-h9tkn_fd726e7f-5139-4eb6-b18d-24d14682648c/manager/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.352485 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-h9tkn_fd726e7f-5139-4eb6-b18d-24d14682648c/kube-rbac-proxy/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.481027 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:18:37 crc kubenswrapper[4907]: E1129 16:18:37.481307 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.511540 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-86bbb9c7fb-ldhkh_dceea103-7394-4d01-9168-0b3f5b49306f/kube-rbac-proxy/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.600300 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-6h9pm_418d0e3c-7354-4e14-b17b-bab93518e78b/kube-rbac-proxy/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.731229 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-6h9pm_418d0e3c-7354-4e14-b17b-bab93518e78b/manager/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.828134 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-dpdcs_45db6747-0449-4839-b0ed-07e930579b83/kube-rbac-proxy/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.868703 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f5c5f9868-d5gtz_e1f880b2-0e04-48c5-81ca-0103abd439fe/manager/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.899934 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-86bbb9c7fb-ldhkh_dceea103-7394-4d01-9168-0b3f5b49306f/manager/0.log" Nov 29 16:18:37 crc kubenswrapper[4907]: I1129 16:18:37.949149 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-dpdcs_45db6747-0449-4839-b0ed-07e930579b83/manager/0.log" Nov 29 16:18:49 crc kubenswrapper[4907]: I1129 16:18:49.480122 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:18:49 crc kubenswrapper[4907]: E1129 16:18:49.480959 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:18:58 crc kubenswrapper[4907]: I1129 16:18:58.561419 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-75hhz_0c5dabfe-62e3-4104-9939-59e4832c6484/control-plane-machine-set-operator/0.log" Nov 29 16:18:58 crc kubenswrapper[4907]: I1129 16:18:58.737041 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jcdhm_d5769473-e380-4d0e-bfe4-aab057473a62/machine-api-operator/0.log" Nov 29 16:18:58 crc kubenswrapper[4907]: I1129 16:18:58.774299 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jcdhm_d5769473-e380-4d0e-bfe4-aab057473a62/kube-rbac-proxy/0.log" Nov 29 16:19:04 crc kubenswrapper[4907]: I1129 16:19:04.480218 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:19:04 crc kubenswrapper[4907]: E1129 16:19:04.481791 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:19:13 crc kubenswrapper[4907]: I1129 16:19:13.230042 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-j4z4g_b2590975-d739-4401-ae0d-8ef8dd6ba179/cert-manager-controller/0.log" Nov 29 16:19:13 crc kubenswrapper[4907]: I1129 16:19:13.344515 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-ps9pn_a1af2f3d-c619-4288-93c6-721cb89dc1cf/cert-manager-cainjector/0.log" Nov 29 16:19:13 crc kubenswrapper[4907]: I1129 16:19:13.423657 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-79pdn_11b5343c-652a-4d02-841e-2373f5b9f0cf/cert-manager-webhook/0.log" Nov 29 16:19:15 crc kubenswrapper[4907]: I1129 16:19:15.479799 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:19:15 crc 
kubenswrapper[4907]: E1129 16:19:15.480609 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:19:26 crc kubenswrapper[4907]: I1129 16:19:26.479604 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:19:26 crc kubenswrapper[4907]: E1129 16:19:26.480375 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:19:26 crc kubenswrapper[4907]: I1129 16:19:26.944117 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-r4k89_25f5423c-ea17-4f48-9552-7012ca67b559/nmstate-console-plugin/0.log" Nov 29 16:19:27 crc kubenswrapper[4907]: I1129 16:19:27.115628 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-w6v2c_3cc3523e-560b-4af9-a232-0c37f3343fac/nmstate-handler/0.log" Nov 29 16:19:27 crc kubenswrapper[4907]: I1129 16:19:27.162113 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-wrbbp_2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf/kube-rbac-proxy/0.log" Nov 29 16:19:27 crc kubenswrapper[4907]: I1129 16:19:27.182850 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-wrbbp_2ce41195-9d9e-43aa-b0e7-77dbe09cc4cf/nmstate-metrics/0.log" Nov 29 16:19:27 crc kubenswrapper[4907]: I1129 16:19:27.382789 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-p8tcl_19da125c-061c-4051-853f-38e13d9a6d5f/nmstate-operator/0.log" Nov 29 16:19:27 crc kubenswrapper[4907]: I1129 16:19:27.392727 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-s4p8c_b0562e46-01ba-4930-a99f-92771a1804a9/nmstate-webhook/0.log" Nov 29 16:19:37 crc kubenswrapper[4907]: I1129 16:19:37.480298 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:19:37 crc kubenswrapper[4907]: E1129 16:19:37.481709 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:19:42 crc kubenswrapper[4907]: I1129 16:19:42.276116 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6ddbc98977-wnwpz_faed25bd-9bb2-4409-927a-e70521fb534c/kube-rbac-proxy/0.log" Nov 29 16:19:42 crc kubenswrapper[4907]: I1129 16:19:42.305564 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6ddbc98977-wnwpz_faed25bd-9bb2-4409-927a-e70521fb534c/manager/0.log" Nov 29 16:19:49 crc kubenswrapper[4907]: I1129 16:19:49.480118 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:19:49 crc 
kubenswrapper[4907]: E1129 16:19:49.480793 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:19:59 crc kubenswrapper[4907]: I1129 16:19:59.665185 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-22svz_75775bda-f952-44db-a0c1-01993254453f/cluster-logging-operator/0.log" Nov 29 16:19:59 crc kubenswrapper[4907]: I1129 16:19:59.816329 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-45v85_6b5ac22b-d575-4fa5-b6d2-7c584a7bbb8a/collector/0.log" Nov 29 16:19:59 crc kubenswrapper[4907]: I1129 16:19:59.892200 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_1395c265-394b-4f9d-9bab-ebdff563a7b2/loki-compactor/0.log" Nov 29 16:20:00 crc kubenswrapper[4907]: I1129 16:20:00.022225 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-fxc7c_54d3a7f1-ba2a-4744-937a-4bf219bb85ab/loki-distributor/0.log" Nov 29 16:20:00 crc kubenswrapper[4907]: I1129 16:20:00.118465 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-586bf9b9f5-7865w_53107106-32ab-4c46-949f-094abb62ce68/gateway/0.log" Nov 29 16:20:00 crc kubenswrapper[4907]: I1129 16:20:00.165318 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-586bf9b9f5-7865w_53107106-32ab-4c46-949f-094abb62ce68/opa/0.log" Nov 29 16:20:00 crc kubenswrapper[4907]: I1129 16:20:00.259535 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-586bf9b9f5-wlw8p_fb80f35c-6683-4296-afd9-a0895e860a3d/gateway/0.log" Nov 29 16:20:00 crc kubenswrapper[4907]: I1129 16:20:00.311511 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-586bf9b9f5-wlw8p_fb80f35c-6683-4296-afd9-a0895e860a3d/opa/0.log" Nov 29 16:20:00 crc kubenswrapper[4907]: I1129 16:20:00.444591 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_d24d8eb0-2f0a-41f7-9234-2ae2bab4b191/loki-index-gateway/0.log" Nov 29 16:20:00 crc kubenswrapper[4907]: I1129 16:20:00.576373 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_7e05f45d-0f5c-45a0-81cb-673104c0f806/loki-ingester/0.log" Nov 29 16:20:00 crc kubenswrapper[4907]: I1129 16:20:00.658614 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-dmfpd_53bdaeef-1d57-48e5-8b2d-bc9edacd5351/loki-querier/0.log" Nov 29 16:20:00 crc kubenswrapper[4907]: I1129 16:20:00.808230 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-xpnnl_34885590-0043-44b5-be42-f726d65f8487/loki-query-frontend/0.log" Nov 29 16:20:04 crc kubenswrapper[4907]: I1129 16:20:04.480420 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:20:04 crc kubenswrapper[4907]: E1129 16:20:04.481452 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 
16:20:17 crc kubenswrapper[4907]: I1129 16:20:17.536616 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-bkm82_a13cb44c-0bae-4a00-9f98-ad5c6f3c6660/kube-rbac-proxy/0.log" Nov 29 16:20:17 crc kubenswrapper[4907]: I1129 16:20:17.686696 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-bkm82_a13cb44c-0bae-4a00-9f98-ad5c6f3c6660/controller/0.log" Nov 29 16:20:17 crc kubenswrapper[4907]: I1129 16:20:17.802313 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-frr-files/0.log" Nov 29 16:20:17 crc kubenswrapper[4907]: I1129 16:20:17.973350 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-frr-files/0.log" Nov 29 16:20:17 crc kubenswrapper[4907]: I1129 16:20:17.987396 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-reloader/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.013898 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-reloader/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.038550 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-metrics/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.254577 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-frr-files/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.262359 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-reloader/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.307230 4907 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-metrics/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.351999 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-metrics/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.480879 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:20:18 crc kubenswrapper[4907]: E1129 16:20:18.481208 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.547258 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-frr-files/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.547458 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-reloader/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.566895 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/cp-metrics/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.653566 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/controller/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.744174 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/frr-metrics/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.820224 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/kube-rbac-proxy/0.log" Nov 29 16:20:18 crc kubenswrapper[4907]: I1129 16:20:18.940104 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/kube-rbac-proxy-frr/0.log" Nov 29 16:20:19 crc kubenswrapper[4907]: I1129 16:20:19.033762 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/reloader/0.log" Nov 29 16:20:19 crc kubenswrapper[4907]: I1129 16:20:19.207153 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-tdzhz_749d56ce-a6c4-4b8f-bd45-0f8a44a9d192/frr-k8s-webhook-server/0.log" Nov 29 16:20:19 crc kubenswrapper[4907]: I1129 16:20:19.495744 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6dfdbf684f-xmtsv_89e52ee4-247d-402e-9c42-8f39e8529314/manager/0.log" Nov 29 16:20:19 crc kubenswrapper[4907]: I1129 16:20:19.593141 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-785f7fb488-nl5qd_008f37e0-a6cb-4202-aed6-fa2b3734e881/webhook-server/0.log" Nov 29 16:20:19 crc kubenswrapper[4907]: I1129 16:20:19.816263 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-csdjw_a0198f8f-d4b9-4452-abda-d3e0df0ec26d/kube-rbac-proxy/0.log" Nov 29 16:20:20 crc kubenswrapper[4907]: I1129 16:20:20.395997 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-84ncb_df794960-249c-4965-814c-36decf5db5d3/frr/0.log" Nov 29 16:20:20 crc kubenswrapper[4907]: I1129 16:20:20.442962 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-csdjw_a0198f8f-d4b9-4452-abda-d3e0df0ec26d/speaker/0.log" Nov 29 16:20:29 crc kubenswrapper[4907]: I1129 16:20:29.480488 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:20:29 crc kubenswrapper[4907]: E1129 16:20:29.483180 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.064422 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/util/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.298432 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/util/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.304821 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/pull/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.316472 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/pull/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.469692 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/extract/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.494395 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/pull/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.496013 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8w8r6p_d1016dfd-9651-4c1f-94f4-312c4eab6a00/util/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.661159 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/util/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.839873 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/pull/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.848241 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/util/0.log" Nov 29 16:20:34 crc kubenswrapper[4907]: I1129 16:20:34.851621 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/pull/0.log" Nov 29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.054800 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/extract/0.log" Nov 
29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.055356 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/util/0.log" Nov 29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.076622 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f2cvwb_1e95c092-9faa-432a-8f5b-b4a831e12946/pull/0.log" Nov 29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.250587 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/util/0.log" Nov 29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.423192 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/pull/0.log" Nov 29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.454311 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/util/0.log" Nov 29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.496723 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/pull/0.log" Nov 29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.795764 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/extract/0.log" Nov 29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.816350 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/pull/0.log" Nov 29 16:20:35 crc kubenswrapper[4907]: I1129 16:20:35.946805 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108zbzn_90303a73-fb9d-454b-a241-ffacdb554862/util/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.029536 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/util/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.196922 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/pull/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.212709 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/util/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.262139 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/pull/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.442332 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/extract/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.448463 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/pull/0.log" Nov 29 
16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.463957 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f6n97g_665b9a04-0d24-45c5-9129-8d37342f2674/util/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.628209 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/util/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.812172 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/util/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.855862 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/pull/0.log" Nov 29 16:20:36 crc kubenswrapper[4907]: I1129 16:20:36.862481 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/pull/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.028029 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/util/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.069793 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/pull/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.077695 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tchlk_1bea1f3c-2bd3-4013-a502-9b9ed934f733/extract/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.222637 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-utilities/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.512265 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-utilities/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.530014 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-content/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.531635 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-content/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.672463 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-content/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.687141 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/extract-utilities/0.log" Nov 29 16:20:37 crc kubenswrapper[4907]: I1129 16:20:37.934283 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-utilities/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.100306 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-content/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.156357 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-content/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.182223 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-utilities/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.410534 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bzmdm_7ce243ff-d352-42f5-82b7-57f145c149c9/registry-server/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.435378 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-content/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.449029 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/extract-utilities/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.664904 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-utilities/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.667979 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ltxlt_7964d25d-6ab7-44e4-9737-41d44ea2a311/marketplace-operator/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.772857 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rxvjq_aa2b263a-b49b-4b7e-bcd7-f17b707db54a/registry-server/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.868210 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-utilities/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.870512 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-content/0.log" Nov 29 16:20:38 crc kubenswrapper[4907]: I1129 16:20:38.882066 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-content/0.log" Nov 29 16:20:39 crc kubenswrapper[4907]: I1129 16:20:39.085609 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-utilities/0.log" Nov 29 16:20:39 crc kubenswrapper[4907]: I1129 16:20:39.127829 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/extract-content/0.log" Nov 29 16:20:39 crc kubenswrapper[4907]: I1129 16:20:39.172967 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-utilities/0.log" Nov 29 16:20:39 crc kubenswrapper[4907]: I1129 16:20:39.320385 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-95kn8_5171ef24-3274-44d5-8d36-8d5be3534c2a/registry-server/0.log" Nov 29 16:20:39 crc kubenswrapper[4907]: I1129 16:20:39.327788 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-utilities/0.log" Nov 29 16:20:39 crc kubenswrapper[4907]: I1129 16:20:39.335738 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-content/0.log" Nov 29 16:20:39 crc kubenswrapper[4907]: I1129 16:20:39.396648 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-content/0.log" Nov 29 16:20:39 crc kubenswrapper[4907]: I1129 16:20:39.543217 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-utilities/0.log" Nov 29 16:20:39 crc kubenswrapper[4907]: I1129 16:20:39.568764 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/extract-content/0.log" Nov 29 16:20:40 crc kubenswrapper[4907]: I1129 16:20:40.425286 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-58pkc_7315fc63-0710-4bcc-a67a-6c2c649192d0/registry-server/0.log" Nov 29 16:20:41 crc kubenswrapper[4907]: I1129 16:20:41.480598 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:20:41 crc kubenswrapper[4907]: E1129 16:20:41.481197 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:20:52 crc 
kubenswrapper[4907]: I1129 16:20:52.517636 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:20:52 crc kubenswrapper[4907]: E1129 16:20:52.519403 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t4jq9_openshift-machine-config-operator(58e4d8d7-8362-41f0-80eb-c07a9219ffbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" Nov 29 16:20:53 crc kubenswrapper[4907]: I1129 16:20:53.983555 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-jt7c8_5c8cbe86-4142-478f-add6-b7d0baf83de6/prometheus-operator/0.log" Nov 29 16:20:54 crc kubenswrapper[4907]: I1129 16:20:54.143046 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b7cf5d557-4h9z9_4b633428-8d76-48d9-bde6-b6233e1d7f40/prometheus-operator-admission-webhook/0.log" Nov 29 16:20:54 crc kubenswrapper[4907]: I1129 16:20:54.169868 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b7cf5d557-vn6jc_21df79d3-1565-4ab3-bdff-8f63941a44f2/prometheus-operator-admission-webhook/0.log" Nov 29 16:20:54 crc kubenswrapper[4907]: I1129 16:20:54.424289 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-sks2s_9e59fbf3-ac79-42b2-84c9-f2afa27c4efb/operator/0.log" Nov 29 16:20:54 crc kubenswrapper[4907]: I1129 16:20:54.426897 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-v6mrf_985db950-2dae-4f2f-8ea4-289b661b1481/observability-ui-dashboards/0.log" Nov 29 16:20:54 
crc kubenswrapper[4907]: I1129 16:20:54.619989 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-5dnkd_258d53e6-9789-4a47-8c51-e928f0ad0f6b/perses-operator/0.log" Nov 29 16:21:07 crc kubenswrapper[4907]: I1129 16:21:07.479573 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4" Nov 29 16:21:08 crc kubenswrapper[4907]: I1129 16:21:08.549186 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"5a6b44fadfcf36276f57aa7789ee87e7ad5b553d1a4e3e08fb48118da3fc32ac"} Nov 29 16:21:09 crc kubenswrapper[4907]: I1129 16:21:09.886414 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6ddbc98977-wnwpz_faed25bd-9bb2-4409-927a-e70521fb534c/manager/0.log" Nov 29 16:21:09 crc kubenswrapper[4907]: I1129 16:21:09.899311 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6ddbc98977-wnwpz_faed25bd-9bb2-4409-927a-e70521fb534c/kube-rbac-proxy/0.log" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.751994 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nrlt7"] Nov 29 16:21:48 crc kubenswrapper[4907]: E1129 16:21:48.753344 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2119e079-794f-479c-a6bc-d0b13b0eb40f" containerName="container-00" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.753370 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2119e079-794f-479c-a6bc-d0b13b0eb40f" containerName="container-00" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.753770 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2119e079-794f-479c-a6bc-d0b13b0eb40f" 
containerName="container-00" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.756571 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.769361 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrlt7"] Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.875725 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-catalog-content\") pod \"community-operators-nrlt7\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.875766 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7vsh\" (UniqueName: \"kubernetes.io/projected/43b34c0f-d8b4-42a3-96cb-86381f42b807-kube-api-access-m7vsh\") pod \"community-operators-nrlt7\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.875884 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-utilities\") pod \"community-operators-nrlt7\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.980139 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7vsh\" (UniqueName: \"kubernetes.io/projected/43b34c0f-d8b4-42a3-96cb-86381f42b807-kube-api-access-m7vsh\") pod \"community-operators-nrlt7\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " 
pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.980301 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-utilities\") pod \"community-operators-nrlt7\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.980426 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-catalog-content\") pod \"community-operators-nrlt7\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.981163 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-utilities\") pod \"community-operators-nrlt7\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:48 crc kubenswrapper[4907]: I1129 16:21:48.982662 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-catalog-content\") pod \"community-operators-nrlt7\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:49 crc kubenswrapper[4907]: I1129 16:21:49.007456 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7vsh\" (UniqueName: \"kubernetes.io/projected/43b34c0f-d8b4-42a3-96cb-86381f42b807-kube-api-access-m7vsh\") pod \"community-operators-nrlt7\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " 
pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:49 crc kubenswrapper[4907]: I1129 16:21:49.088872 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:49 crc kubenswrapper[4907]: I1129 16:21:49.847952 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrlt7"] Nov 29 16:21:49 crc kubenswrapper[4907]: I1129 16:21:49.973511 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlt7" event={"ID":"43b34c0f-d8b4-42a3-96cb-86381f42b807","Type":"ContainerStarted","Data":"d2dbc1b7c3d98af169773fbb168d9b5d1be5cf37f9723613ab30819272134db8"} Nov 29 16:21:50 crc kubenswrapper[4907]: I1129 16:21:50.984949 4907 generic.go:334] "Generic (PLEG): container finished" podID="43b34c0f-d8b4-42a3-96cb-86381f42b807" containerID="7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e" exitCode=0 Nov 29 16:21:50 crc kubenswrapper[4907]: I1129 16:21:50.985085 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlt7" event={"ID":"43b34c0f-d8b4-42a3-96cb-86381f42b807","Type":"ContainerDied","Data":"7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e"} Nov 29 16:21:50 crc kubenswrapper[4907]: I1129 16:21:50.988274 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 16:21:53 crc kubenswrapper[4907]: I1129 16:21:53.018491 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlt7" event={"ID":"43b34c0f-d8b4-42a3-96cb-86381f42b807","Type":"ContainerDied","Data":"57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d"} Nov 29 16:21:53 crc kubenswrapper[4907]: I1129 16:21:53.018272 4907 generic.go:334] "Generic (PLEG): container finished" podID="43b34c0f-d8b4-42a3-96cb-86381f42b807" 
containerID="57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d" exitCode=0 Nov 29 16:21:54 crc kubenswrapper[4907]: I1129 16:21:54.033299 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlt7" event={"ID":"43b34c0f-d8b4-42a3-96cb-86381f42b807","Type":"ContainerStarted","Data":"4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553"} Nov 29 16:21:54 crc kubenswrapper[4907]: I1129 16:21:54.067950 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nrlt7" podStartSLOduration=3.616516248 podStartE2EDuration="6.067656451s" podCreationTimestamp="2025-11-29 16:21:48 +0000 UTC" firstStartedPulling="2025-11-29 16:21:50.987555764 +0000 UTC m=+6808.974393416" lastFinishedPulling="2025-11-29 16:21:53.438695967 +0000 UTC m=+6811.425533619" observedRunningTime="2025-11-29 16:21:54.056252406 +0000 UTC m=+6812.043090068" watchObservedRunningTime="2025-11-29 16:21:54.067656451 +0000 UTC m=+6812.054494113" Nov 29 16:21:59 crc kubenswrapper[4907]: I1129 16:21:59.089007 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:59 crc kubenswrapper[4907]: I1129 16:21:59.090650 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:21:59 crc kubenswrapper[4907]: I1129 16:21:59.159330 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:22:00 crc kubenswrapper[4907]: I1129 16:22:00.240344 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:22:00 crc kubenswrapper[4907]: I1129 16:22:00.506768 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrlt7"] Nov 29 16:22:02 
crc kubenswrapper[4907]: I1129 16:22:02.138031 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nrlt7" podUID="43b34c0f-d8b4-42a3-96cb-86381f42b807" containerName="registry-server" containerID="cri-o://4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553" gracePeriod=2 Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.757375 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.880893 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-utilities\") pod \"43b34c0f-d8b4-42a3-96cb-86381f42b807\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.881013 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7vsh\" (UniqueName: \"kubernetes.io/projected/43b34c0f-d8b4-42a3-96cb-86381f42b807-kube-api-access-m7vsh\") pod \"43b34c0f-d8b4-42a3-96cb-86381f42b807\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.881101 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-catalog-content\") pod \"43b34c0f-d8b4-42a3-96cb-86381f42b807\" (UID: \"43b34c0f-d8b4-42a3-96cb-86381f42b807\") " Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.882180 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-utilities" (OuterVolumeSpecName: "utilities") pod "43b34c0f-d8b4-42a3-96cb-86381f42b807" (UID: "43b34c0f-d8b4-42a3-96cb-86381f42b807"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.890515 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b34c0f-d8b4-42a3-96cb-86381f42b807-kube-api-access-m7vsh" (OuterVolumeSpecName: "kube-api-access-m7vsh") pod "43b34c0f-d8b4-42a3-96cb-86381f42b807" (UID: "43b34c0f-d8b4-42a3-96cb-86381f42b807"). InnerVolumeSpecName "kube-api-access-m7vsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.938798 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43b34c0f-d8b4-42a3-96cb-86381f42b807" (UID: "43b34c0f-d8b4-42a3-96cb-86381f42b807"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.983544 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.983595 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7vsh\" (UniqueName: \"kubernetes.io/projected/43b34c0f-d8b4-42a3-96cb-86381f42b807-kube-api-access-m7vsh\") on node \"crc\" DevicePath \"\"" Nov 29 16:22:02 crc kubenswrapper[4907]: I1129 16:22:02.983614 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b34c0f-d8b4-42a3-96cb-86381f42b807-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.150228 4907 generic.go:334] "Generic (PLEG): container finished" podID="43b34c0f-d8b4-42a3-96cb-86381f42b807" 
containerID="4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553" exitCode=0 Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.150292 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrlt7" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.150309 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlt7" event={"ID":"43b34c0f-d8b4-42a3-96cb-86381f42b807","Type":"ContainerDied","Data":"4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553"} Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.150336 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrlt7" event={"ID":"43b34c0f-d8b4-42a3-96cb-86381f42b807","Type":"ContainerDied","Data":"d2dbc1b7c3d98af169773fbb168d9b5d1be5cf37f9723613ab30819272134db8"} Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.150362 4907 scope.go:117] "RemoveContainer" containerID="4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.192251 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrlt7"] Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.199689 4907 scope.go:117] "RemoveContainer" containerID="57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.202569 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nrlt7"] Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.222506 4907 scope.go:117] "RemoveContainer" containerID="7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.353146 4907 scope.go:117] "RemoveContainer" containerID="4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553" Nov 29 
16:22:03 crc kubenswrapper[4907]: E1129 16:22:03.353897 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553\": container with ID starting with 4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553 not found: ID does not exist" containerID="4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.353999 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553"} err="failed to get container status \"4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553\": rpc error: code = NotFound desc = could not find container \"4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553\": container with ID starting with 4edeb4ac30601912daf643d5085404df9703c63a7d45729e0b3fb971899d2553 not found: ID does not exist" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.354020 4907 scope.go:117] "RemoveContainer" containerID="57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d" Nov 29 16:22:03 crc kubenswrapper[4907]: E1129 16:22:03.354840 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d\": container with ID starting with 57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d not found: ID does not exist" containerID="57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.354893 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d"} err="failed to get container status 
\"57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d\": rpc error: code = NotFound desc = could not find container \"57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d\": container with ID starting with 57b42f7129f6c7eb212abf4c590e6be57d977de6bb313c4fcc55b7254a9fcb9d not found: ID does not exist" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.354921 4907 scope.go:117] "RemoveContainer" containerID="7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e" Nov 29 16:22:03 crc kubenswrapper[4907]: E1129 16:22:03.355191 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e\": container with ID starting with 7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e not found: ID does not exist" containerID="7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e" Nov 29 16:22:03 crc kubenswrapper[4907]: I1129 16:22:03.355217 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e"} err="failed to get container status \"7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e\": rpc error: code = NotFound desc = could not find container \"7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e\": container with ID starting with 7abf7b1aaca1ffede50564955fbf1bc7a264556489dc9c2676158fede8da769e not found: ID does not exist" Nov 29 16:22:04 crc kubenswrapper[4907]: I1129 16:22:04.499396 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b34c0f-d8b4-42a3-96cb-86381f42b807" path="/var/lib/kubelet/pods/43b34c0f-d8b4-42a3-96cb-86381f42b807/volumes" Nov 29 16:22:56 crc kubenswrapper[4907]: I1129 16:22:56.858596 4907 generic.go:334] "Generic (PLEG): container finished" podID="3f55f8cf-3b4e-4370-9980-0958693dc72c" 
containerID="55578ea83743f1b31699c3689fe6a11811d7a505c74ee9110d9265bb49b19011" exitCode=0 Nov 29 16:22:56 crc kubenswrapper[4907]: I1129 16:22:56.858872 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" event={"ID":"3f55f8cf-3b4e-4370-9980-0958693dc72c","Type":"ContainerDied","Data":"55578ea83743f1b31699c3689fe6a11811d7a505c74ee9110d9265bb49b19011"} Nov 29 16:22:56 crc kubenswrapper[4907]: I1129 16:22:56.861825 4907 scope.go:117] "RemoveContainer" containerID="55578ea83743f1b31699c3689fe6a11811d7a505c74ee9110d9265bb49b19011" Nov 29 16:22:57 crc kubenswrapper[4907]: I1129 16:22:57.643083 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dxjdc_must-gather-lgmwk_3f55f8cf-3b4e-4370-9980-0958693dc72c/gather/0.log" Nov 29 16:23:09 crc kubenswrapper[4907]: I1129 16:23:09.893250 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dxjdc/must-gather-lgmwk"] Nov 29 16:23:09 crc kubenswrapper[4907]: I1129 16:23:09.894099 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" podUID="3f55f8cf-3b4e-4370-9980-0958693dc72c" containerName="copy" containerID="cri-o://2018ebe42f9be0d2863275b7dfd8e130ea1c9c8d71b378d9327f9bd755084201" gracePeriod=2 Nov 29 16:23:09 crc kubenswrapper[4907]: I1129 16:23:09.908194 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dxjdc/must-gather-lgmwk"] Nov 29 16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.071951 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dxjdc_must-gather-lgmwk_3f55f8cf-3b4e-4370-9980-0958693dc72c/copy/0.log" Nov 29 16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.073817 4907 generic.go:334] "Generic (PLEG): container finished" podID="3f55f8cf-3b4e-4370-9980-0958693dc72c" containerID="2018ebe42f9be0d2863275b7dfd8e130ea1c9c8d71b378d9327f9bd755084201" exitCode=143 Nov 29 
16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.474333 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dxjdc_must-gather-lgmwk_3f55f8cf-3b4e-4370-9980-0958693dc72c/copy/0.log" Nov 29 16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.474963 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.651092 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f55f8cf-3b4e-4370-9980-0958693dc72c-must-gather-output\") pod \"3f55f8cf-3b4e-4370-9980-0958693dc72c\" (UID: \"3f55f8cf-3b4e-4370-9980-0958693dc72c\") " Nov 29 16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.651392 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnlr8\" (UniqueName: \"kubernetes.io/projected/3f55f8cf-3b4e-4370-9980-0958693dc72c-kube-api-access-rnlr8\") pod \"3f55f8cf-3b4e-4370-9980-0958693dc72c\" (UID: \"3f55f8cf-3b4e-4370-9980-0958693dc72c\") " Nov 29 16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.674247 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f55f8cf-3b4e-4370-9980-0958693dc72c-kube-api-access-rnlr8" (OuterVolumeSpecName: "kube-api-access-rnlr8") pod "3f55f8cf-3b4e-4370-9980-0958693dc72c" (UID: "3f55f8cf-3b4e-4370-9980-0958693dc72c"). InnerVolumeSpecName "kube-api-access-rnlr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.754990 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnlr8\" (UniqueName: \"kubernetes.io/projected/3f55f8cf-3b4e-4370-9980-0958693dc72c-kube-api-access-rnlr8\") on node \"crc\" DevicePath \"\"" Nov 29 16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.833046 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f55f8cf-3b4e-4370-9980-0958693dc72c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3f55f8cf-3b4e-4370-9980-0958693dc72c" (UID: "3f55f8cf-3b4e-4370-9980-0958693dc72c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:23:10 crc kubenswrapper[4907]: I1129 16:23:10.857150 4907 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f55f8cf-3b4e-4370-9980-0958693dc72c-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 29 16:23:11 crc kubenswrapper[4907]: I1129 16:23:11.085115 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dxjdc_must-gather-lgmwk_3f55f8cf-3b4e-4370-9980-0958693dc72c/copy/0.log" Nov 29 16:23:11 crc kubenswrapper[4907]: I1129 16:23:11.085923 4907 scope.go:117] "RemoveContainer" containerID="2018ebe42f9be0d2863275b7dfd8e130ea1c9c8d71b378d9327f9bd755084201" Nov 29 16:23:11 crc kubenswrapper[4907]: I1129 16:23:11.086004 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxjdc/must-gather-lgmwk" Nov 29 16:23:11 crc kubenswrapper[4907]: I1129 16:23:11.107208 4907 scope.go:117] "RemoveContainer" containerID="55578ea83743f1b31699c3689fe6a11811d7a505c74ee9110d9265bb49b19011" Nov 29 16:23:12 crc kubenswrapper[4907]: I1129 16:23:12.503753 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f55f8cf-3b4e-4370-9980-0958693dc72c" path="/var/lib/kubelet/pods/3f55f8cf-3b4e-4370-9980-0958693dc72c/volumes" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.015650 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbnj"] Nov 29 16:23:22 crc kubenswrapper[4907]: E1129 16:23:22.016654 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b34c0f-d8b4-42a3-96cb-86381f42b807" containerName="registry-server" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.016671 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b34c0f-d8b4-42a3-96cb-86381f42b807" containerName="registry-server" Nov 29 16:23:22 crc kubenswrapper[4907]: E1129 16:23:22.016690 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f55f8cf-3b4e-4370-9980-0958693dc72c" containerName="gather" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.016699 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f55f8cf-3b4e-4370-9980-0958693dc72c" containerName="gather" Nov 29 16:23:22 crc kubenswrapper[4907]: E1129 16:23:22.016739 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f55f8cf-3b4e-4370-9980-0958693dc72c" containerName="copy" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.016747 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f55f8cf-3b4e-4370-9980-0958693dc72c" containerName="copy" Nov 29 16:23:22 crc kubenswrapper[4907]: E1129 16:23:22.016769 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b34c0f-d8b4-42a3-96cb-86381f42b807" 
containerName="extract-content" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.016775 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b34c0f-d8b4-42a3-96cb-86381f42b807" containerName="extract-content" Nov 29 16:23:22 crc kubenswrapper[4907]: E1129 16:23:22.016785 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b34c0f-d8b4-42a3-96cb-86381f42b807" containerName="extract-utilities" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.016791 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b34c0f-d8b4-42a3-96cb-86381f42b807" containerName="extract-utilities" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.017016 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f55f8cf-3b4e-4370-9980-0958693dc72c" containerName="copy" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.017033 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f55f8cf-3b4e-4370-9980-0958693dc72c" containerName="gather" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.017053 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b34c0f-d8b4-42a3-96cb-86381f42b807" containerName="registry-server" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.018725 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.030145 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbnj"] Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.155925 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7klf\" (UniqueName: \"kubernetes.io/projected/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-kube-api-access-m7klf\") pod \"redhat-marketplace-fgbnj\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.156014 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-catalog-content\") pod \"redhat-marketplace-fgbnj\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.156073 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-utilities\") pod \"redhat-marketplace-fgbnj\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.258221 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7klf\" (UniqueName: \"kubernetes.io/projected/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-kube-api-access-m7klf\") pod \"redhat-marketplace-fgbnj\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.258302 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-catalog-content\") pod \"redhat-marketplace-fgbnj\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.258351 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-utilities\") pod \"redhat-marketplace-fgbnj\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.258959 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-catalog-content\") pod \"redhat-marketplace-fgbnj\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.259009 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-utilities\") pod \"redhat-marketplace-fgbnj\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.288512 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7klf\" (UniqueName: \"kubernetes.io/projected/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-kube-api-access-m7klf\") pod \"redhat-marketplace-fgbnj\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.343707 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:22 crc kubenswrapper[4907]: I1129 16:23:22.916964 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbnj"] Nov 29 16:23:23 crc kubenswrapper[4907]: I1129 16:23:23.241046 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerID="f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395" exitCode=0 Nov 29 16:23:23 crc kubenswrapper[4907]: I1129 16:23:23.241112 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbnj" event={"ID":"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd","Type":"ContainerDied","Data":"f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395"} Nov 29 16:23:23 crc kubenswrapper[4907]: I1129 16:23:23.241161 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbnj" event={"ID":"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd","Type":"ContainerStarted","Data":"49e69c9bc35576af34921a2ff31c8795799d6189b92dd47f4848886008eb3568"} Nov 29 16:23:24 crc kubenswrapper[4907]: I1129 16:23:24.251766 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbnj" event={"ID":"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd","Type":"ContainerStarted","Data":"9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3"} Nov 29 16:23:25 crc kubenswrapper[4907]: I1129 16:23:25.267025 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerID="9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3" exitCode=0 Nov 29 16:23:25 crc kubenswrapper[4907]: I1129 16:23:25.267103 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbnj" 
event={"ID":"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd","Type":"ContainerDied","Data":"9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3"} Nov 29 16:23:26 crc kubenswrapper[4907]: I1129 16:23:26.305269 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbnj" event={"ID":"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd","Type":"ContainerStarted","Data":"a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884"} Nov 29 16:23:26 crc kubenswrapper[4907]: I1129 16:23:26.344310 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fgbnj" podStartSLOduration=2.647524195 podStartE2EDuration="5.34428255s" podCreationTimestamp="2025-11-29 16:23:21 +0000 UTC" firstStartedPulling="2025-11-29 16:23:23.243204302 +0000 UTC m=+6901.230041954" lastFinishedPulling="2025-11-29 16:23:25.939962637 +0000 UTC m=+6903.926800309" observedRunningTime="2025-11-29 16:23:26.329813349 +0000 UTC m=+6904.316651041" watchObservedRunningTime="2025-11-29 16:23:26.34428255 +0000 UTC m=+6904.331120202" Nov 29 16:23:28 crc kubenswrapper[4907]: I1129 16:23:28.490284 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:23:28 crc kubenswrapper[4907]: I1129 16:23:28.490785 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:23:32 crc kubenswrapper[4907]: I1129 16:23:32.344477 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:32 crc kubenswrapper[4907]: I1129 16:23:32.344881 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:32 crc kubenswrapper[4907]: I1129 16:23:32.425553 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:32 crc kubenswrapper[4907]: I1129 16:23:32.512532 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:32 crc kubenswrapper[4907]: I1129 16:23:32.655247 4907 scope.go:117] "RemoveContainer" containerID="52d6c1f86f76aac6d5bb798e0ca1c62728cdc22dd537e50e92ab4c72d1f7d37b" Nov 29 16:23:32 crc kubenswrapper[4907]: I1129 16:23:32.695015 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbnj"] Nov 29 16:23:34 crc kubenswrapper[4907]: I1129 16:23:34.444359 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fgbnj" podUID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerName="registry-server" containerID="cri-o://a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884" gracePeriod=2 Nov 29 16:23:34 crc kubenswrapper[4907]: I1129 16:23:34.980826 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.116979 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-utilities\") pod \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.117152 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7klf\" (UniqueName: \"kubernetes.io/projected/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-kube-api-access-m7klf\") pod \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.117240 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-catalog-content\") pod \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\" (UID: \"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd\") " Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.117784 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-utilities" (OuterVolumeSpecName: "utilities") pod "e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" (UID: "e5d6d23d-e9b9-431b-b48e-9685c9cf81fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.129076 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-kube-api-access-m7klf" (OuterVolumeSpecName: "kube-api-access-m7klf") pod "e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" (UID: "e5d6d23d-e9b9-431b-b48e-9685c9cf81fd"). InnerVolumeSpecName "kube-api-access-m7klf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.135665 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" (UID: "e5d6d23d-e9b9-431b-b48e-9685c9cf81fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.220044 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.220087 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7klf\" (UniqueName: \"kubernetes.io/projected/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-kube-api-access-m7klf\") on node \"crc\" DevicePath \"\"" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.220100 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.464332 4907 generic.go:334] "Generic (PLEG): container finished" podID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerID="a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884" exitCode=0 Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.464403 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbnj" event={"ID":"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd","Type":"ContainerDied","Data":"a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884"} Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.464513 4907 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgbnj" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.464585 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbnj" event={"ID":"e5d6d23d-e9b9-431b-b48e-9685c9cf81fd","Type":"ContainerDied","Data":"49e69c9bc35576af34921a2ff31c8795799d6189b92dd47f4848886008eb3568"} Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.464585 4907 scope.go:117] "RemoveContainer" containerID="a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.501033 4907 scope.go:117] "RemoveContainer" containerID="9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.532713 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbnj"] Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.548877 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbnj"] Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.568522 4907 scope.go:117] "RemoveContainer" containerID="f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.643936 4907 scope.go:117] "RemoveContainer" containerID="a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884" Nov 29 16:23:35 crc kubenswrapper[4907]: E1129 16:23:35.644817 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884\": container with ID starting with a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884 not found: ID does not exist" containerID="a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.644910 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884"} err="failed to get container status \"a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884\": rpc error: code = NotFound desc = could not find container \"a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884\": container with ID starting with a9de0721f09dff0acd5d42b85d647cdbb85c995d67851978adca7ac7a7682884 not found: ID does not exist" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.644955 4907 scope.go:117] "RemoveContainer" containerID="9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3" Nov 29 16:23:35 crc kubenswrapper[4907]: E1129 16:23:35.645632 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3\": container with ID starting with 9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3 not found: ID does not exist" containerID="9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.645724 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3"} err="failed to get container status \"9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3\": rpc error: code = NotFound desc = could not find container \"9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3\": container with ID starting with 9b10f0e5b6854d1ca5bed5f1c59766cb1fdd7060c39a53f98ec73bca21bb97b3 not found: ID does not exist" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.645798 4907 scope.go:117] "RemoveContainer" containerID="f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395" Nov 29 16:23:35 crc kubenswrapper[4907]: E1129 
16:23:35.646315 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395\": container with ID starting with f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395 not found: ID does not exist" containerID="f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395" Nov 29 16:23:35 crc kubenswrapper[4907]: I1129 16:23:35.646357 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395"} err="failed to get container status \"f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395\": rpc error: code = NotFound desc = could not find container \"f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395\": container with ID starting with f6b6ba86b4fde932f18d56c972acf5bd0ffb91e2e4fb77b1990f0e800e283395 not found: ID does not exist" Nov 29 16:23:36 crc kubenswrapper[4907]: I1129 16:23:36.516519 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" path="/var/lib/kubelet/pods/e5d6d23d-e9b9-431b-b48e-9685c9cf81fd/volumes" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.258560 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l4svc"] Nov 29 16:23:48 crc kubenswrapper[4907]: E1129 16:23:48.259805 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerName="extract-content" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.259823 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerName="extract-content" Nov 29 16:23:48 crc kubenswrapper[4907]: E1129 16:23:48.259845 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerName="registry-server" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.259854 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerName="registry-server" Nov 29 16:23:48 crc kubenswrapper[4907]: E1129 16:23:48.259906 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerName="extract-utilities" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.259915 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerName="extract-utilities" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.260224 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d6d23d-e9b9-431b-b48e-9685c9cf81fd" containerName="registry-server" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.262492 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.275460 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l4svc"] Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.379898 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-utilities\") pod \"certified-operators-l4svc\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") " pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.379964 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnhlt\" (UniqueName: \"kubernetes.io/projected/19d9560a-b4de-4da7-a984-e5d82b024ba5-kube-api-access-bnhlt\") pod \"certified-operators-l4svc\" (UID: 
\"19d9560a-b4de-4da7-a984-e5d82b024ba5\") " pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.380069 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-catalog-content\") pod \"certified-operators-l4svc\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") " pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.481510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-utilities\") pod \"certified-operators-l4svc\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") " pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.481566 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnhlt\" (UniqueName: \"kubernetes.io/projected/19d9560a-b4de-4da7-a984-e5d82b024ba5-kube-api-access-bnhlt\") pod \"certified-operators-l4svc\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") " pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.481647 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-catalog-content\") pod \"certified-operators-l4svc\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") " pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.482009 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-utilities\") pod \"certified-operators-l4svc\" (UID: 
\"19d9560a-b4de-4da7-a984-e5d82b024ba5\") " pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.482151 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-catalog-content\") pod \"certified-operators-l4svc\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") " pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.505824 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnhlt\" (UniqueName: \"kubernetes.io/projected/19d9560a-b4de-4da7-a984-e5d82b024ba5-kube-api-access-bnhlt\") pod \"certified-operators-l4svc\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") " pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.585207 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l4svc" Nov 29 16:23:48 crc kubenswrapper[4907]: I1129 16:23:48.939936 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l4svc"] Nov 29 16:23:49 crc kubenswrapper[4907]: I1129 16:23:49.658732 4907 generic.go:334] "Generic (PLEG): container finished" podID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerID="3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522" exitCode=0 Nov 29 16:23:49 crc kubenswrapper[4907]: I1129 16:23:49.658803 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4svc" event={"ID":"19d9560a-b4de-4da7-a984-e5d82b024ba5","Type":"ContainerDied","Data":"3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522"} Nov 29 16:23:49 crc kubenswrapper[4907]: I1129 16:23:49.659264 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4svc" event={"ID":"19d9560a-b4de-4da7-a984-e5d82b024ba5","Type":"ContainerStarted","Data":"7f7bd5a7c9f549379eb6d1b10b53692e82d5bfd24a387451e53b0fcc0c1cd6bc"} Nov 29 16:23:50 crc kubenswrapper[4907]: I1129 16:23:50.674309 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4svc" event={"ID":"19d9560a-b4de-4da7-a984-e5d82b024ba5","Type":"ContainerStarted","Data":"bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6"} Nov 29 16:23:51 crc kubenswrapper[4907]: I1129 16:23:51.685644 4907 generic.go:334] "Generic (PLEG): container finished" podID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerID="bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6" exitCode=0 Nov 29 16:23:51 crc kubenswrapper[4907]: I1129 16:23:51.685729 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4svc" 
event={"ID":"19d9560a-b4de-4da7-a984-e5d82b024ba5","Type":"ContainerDied","Data":"bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6"} Nov 29 16:23:52 crc kubenswrapper[4907]: I1129 16:23:52.698570 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4svc" event={"ID":"19d9560a-b4de-4da7-a984-e5d82b024ba5","Type":"ContainerStarted","Data":"436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6"} Nov 29 16:23:52 crc kubenswrapper[4907]: I1129 16:23:52.765268 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l4svc" podStartSLOduration=2.348595965 podStartE2EDuration="4.765249003s" podCreationTimestamp="2025-11-29 16:23:48 +0000 UTC" firstStartedPulling="2025-11-29 16:23:49.661226444 +0000 UTC m=+6927.648064096" lastFinishedPulling="2025-11-29 16:23:52.077879482 +0000 UTC m=+6930.064717134" observedRunningTime="2025-11-29 16:23:52.721485223 +0000 UTC m=+6930.708322875" watchObservedRunningTime="2025-11-29 16:23:52.765249003 +0000 UTC m=+6930.752086655" Nov 29 16:23:58 crc kubenswrapper[4907]: I1129 16:23:58.490002 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 16:23:58 crc kubenswrapper[4907]: I1129 16:23:58.490645 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 16:23:58 crc kubenswrapper[4907]: I1129 16:23:58.585725 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-l4svc"
Nov 29 16:23:58 crc kubenswrapper[4907]: I1129 16:23:58.585803 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l4svc"
Nov 29 16:23:58 crc kubenswrapper[4907]: I1129 16:23:58.653210 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l4svc"
Nov 29 16:23:58 crc kubenswrapper[4907]: I1129 16:23:58.825531 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l4svc"
Nov 29 16:23:58 crc kubenswrapper[4907]: I1129 16:23:58.893336 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l4svc"]
Nov 29 16:24:00 crc kubenswrapper[4907]: I1129 16:24:00.809485 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l4svc" podUID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerName="registry-server" containerID="cri-o://436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6" gracePeriod=2
Nov 29 16:24:00 crc kubenswrapper[4907]: E1129 16:24:00.979390 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d9560a_b4de_4da7_a984_e5d82b024ba5.slice/crio-conmon-436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6.scope\": RecentStats: unable to find data in memory cache]"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.434695 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l4svc"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.595911 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnhlt\" (UniqueName: \"kubernetes.io/projected/19d9560a-b4de-4da7-a984-e5d82b024ba5-kube-api-access-bnhlt\") pod \"19d9560a-b4de-4da7-a984-e5d82b024ba5\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") "
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.595967 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-catalog-content\") pod \"19d9560a-b4de-4da7-a984-e5d82b024ba5\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") "
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.596017 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-utilities\") pod \"19d9560a-b4de-4da7-a984-e5d82b024ba5\" (UID: \"19d9560a-b4de-4da7-a984-e5d82b024ba5\") "
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.596857 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-utilities" (OuterVolumeSpecName: "utilities") pod "19d9560a-b4de-4da7-a984-e5d82b024ba5" (UID: "19d9560a-b4de-4da7-a984-e5d82b024ba5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.605470 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d9560a-b4de-4da7-a984-e5d82b024ba5-kube-api-access-bnhlt" (OuterVolumeSpecName: "kube-api-access-bnhlt") pod "19d9560a-b4de-4da7-a984-e5d82b024ba5" (UID: "19d9560a-b4de-4da7-a984-e5d82b024ba5"). InnerVolumeSpecName "kube-api-access-bnhlt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.648419 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19d9560a-b4de-4da7-a984-e5d82b024ba5" (UID: "19d9560a-b4de-4da7-a984-e5d82b024ba5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.699070 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnhlt\" (UniqueName: \"kubernetes.io/projected/19d9560a-b4de-4da7-a984-e5d82b024ba5-kube-api-access-bnhlt\") on node \"crc\" DevicePath \"\""
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.699124 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.699136 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d9560a-b4de-4da7-a984-e5d82b024ba5-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.829038 4907 generic.go:334] "Generic (PLEG): container finished" podID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerID="436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6" exitCode=0
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.829086 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4svc" event={"ID":"19d9560a-b4de-4da7-a984-e5d82b024ba5","Type":"ContainerDied","Data":"436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6"}
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.829120 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l4svc" event={"ID":"19d9560a-b4de-4da7-a984-e5d82b024ba5","Type":"ContainerDied","Data":"7f7bd5a7c9f549379eb6d1b10b53692e82d5bfd24a387451e53b0fcc0c1cd6bc"}
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.829123 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l4svc"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.829142 4907 scope.go:117] "RemoveContainer" containerID="436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.868901 4907 scope.go:117] "RemoveContainer" containerID="bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.873924 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l4svc"]
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.893167 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l4svc"]
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.898190 4907 scope.go:117] "RemoveContainer" containerID="3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.947889 4907 scope.go:117] "RemoveContainer" containerID="436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6"
Nov 29 16:24:01 crc kubenswrapper[4907]: E1129 16:24:01.960624 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6\": container with ID starting with 436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6 not found: ID does not exist" containerID="436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.960683 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6"} err="failed to get container status \"436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6\": rpc error: code = NotFound desc = could not find container \"436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6\": container with ID starting with 436b5634f3eddc70b03e0eff505373a23c6ae61dc788bcea187372896c523ab6 not found: ID does not exist"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.960712 4907 scope.go:117] "RemoveContainer" containerID="bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6"
Nov 29 16:24:01 crc kubenswrapper[4907]: E1129 16:24:01.961557 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6\": container with ID starting with bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6 not found: ID does not exist" containerID="bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.961602 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6"} err="failed to get container status \"bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6\": rpc error: code = NotFound desc = could not find container \"bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6\": container with ID starting with bc116aad2d92928ba75a2be9df7eddf5e7a2e85e46178b7c53864f2d562965c6 not found: ID does not exist"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.961628 4907 scope.go:117] "RemoveContainer" containerID="3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522"
Nov 29 16:24:01 crc kubenswrapper[4907]: E1129 16:24:01.962356 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522\": container with ID starting with 3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522 not found: ID does not exist" containerID="3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522"
Nov 29 16:24:01 crc kubenswrapper[4907]: I1129 16:24:01.962393 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522"} err="failed to get container status \"3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522\": rpc error: code = NotFound desc = could not find container \"3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522\": container with ID starting with 3e64af49d890c83aeb1df284e1f8eee66a96e9f6b09e95f7c3ccd8dab9f2b522 not found: ID does not exist"
Nov 29 16:24:02 crc kubenswrapper[4907]: I1129 16:24:02.508218 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d9560a-b4de-4da7-a984-e5d82b024ba5" path="/var/lib/kubelet/pods/19d9560a-b4de-4da7-a984-e5d82b024ba5/volumes"
Nov 29 16:24:28 crc kubenswrapper[4907]: I1129 16:24:28.491833 4907 patch_prober.go:28] interesting pod/machine-config-daemon-t4jq9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 16:24:28 crc kubenswrapper[4907]: I1129 16:24:28.492469 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 16:24:28 crc kubenswrapper[4907]: I1129 16:24:28.494643 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9"
Nov 29 16:24:28 crc kubenswrapper[4907]: I1129 16:24:28.496008 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5a6b44fadfcf36276f57aa7789ee87e7ad5b553d1a4e3e08fb48118da3fc32ac"} pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 29 16:24:28 crc kubenswrapper[4907]: I1129 16:24:28.496088 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" podUID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerName="machine-config-daemon" containerID="cri-o://5a6b44fadfcf36276f57aa7789ee87e7ad5b553d1a4e3e08fb48118da3fc32ac" gracePeriod=600
Nov 29 16:24:29 crc kubenswrapper[4907]: I1129 16:24:29.222164 4907 generic.go:334] "Generic (PLEG): container finished" podID="58e4d8d7-8362-41f0-80eb-c07a9219ffbd" containerID="5a6b44fadfcf36276f57aa7789ee87e7ad5b553d1a4e3e08fb48118da3fc32ac" exitCode=0
Nov 29 16:24:29 crc kubenswrapper[4907]: I1129 16:24:29.222415 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerDied","Data":"5a6b44fadfcf36276f57aa7789ee87e7ad5b553d1a4e3e08fb48118da3fc32ac"}
Nov 29 16:24:29 crc kubenswrapper[4907]: I1129 16:24:29.222651 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t4jq9" event={"ID":"58e4d8d7-8362-41f0-80eb-c07a9219ffbd","Type":"ContainerStarted","Data":"e1ca39f27c69b68629397e987d2da202684f15885c0032b86518d0d692b9662b"}
Nov 29 16:24:29 crc kubenswrapper[4907]: I1129 16:24:29.222721 4907 scope.go:117] "RemoveContainer" containerID="d0a376400d3090b6ddc7c7461691bf33f2ea3fc15312b3913d22e311312a0af4"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.055130 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5vj2"]
Nov 29 16:24:40 crc kubenswrapper[4907]: E1129 16:24:40.056084 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerName="extract-content"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.056096 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerName="extract-content"
Nov 29 16:24:40 crc kubenswrapper[4907]: E1129 16:24:40.056110 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerName="registry-server"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.056116 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerName="registry-server"
Nov 29 16:24:40 crc kubenswrapper[4907]: E1129 16:24:40.056125 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerName="extract-utilities"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.056132 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerName="extract-utilities"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.056404 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d9560a-b4de-4da7-a984-e5d82b024ba5" containerName="registry-server"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.058157 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.068354 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5vj2"]
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.137720 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-catalog-content\") pod \"redhat-operators-w5vj2\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") " pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.138048 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-utilities\") pod \"redhat-operators-w5vj2\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") " pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.138135 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-662r6\" (UniqueName: \"kubernetes.io/projected/55ed86eb-e103-4471-a543-117a0031aef3-kube-api-access-662r6\") pod \"redhat-operators-w5vj2\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") " pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.240492 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-utilities\") pod \"redhat-operators-w5vj2\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") " pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.240745 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-662r6\" (UniqueName: \"kubernetes.io/projected/55ed86eb-e103-4471-a543-117a0031aef3-kube-api-access-662r6\") pod \"redhat-operators-w5vj2\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") " pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.240887 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-utilities\") pod \"redhat-operators-w5vj2\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") " pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.241063 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-catalog-content\") pod \"redhat-operators-w5vj2\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") " pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.241875 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-catalog-content\") pod \"redhat-operators-w5vj2\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") " pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.266253 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-662r6\" (UniqueName: \"kubernetes.io/projected/55ed86eb-e103-4471-a543-117a0031aef3-kube-api-access-662r6\") pod \"redhat-operators-w5vj2\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") " pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.398194 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:40 crc kubenswrapper[4907]: I1129 16:24:40.958684 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5vj2"]
Nov 29 16:24:41 crc kubenswrapper[4907]: I1129 16:24:41.409241 4907 generic.go:334] "Generic (PLEG): container finished" podID="55ed86eb-e103-4471-a543-117a0031aef3" containerID="5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84" exitCode=0
Nov 29 16:24:41 crc kubenswrapper[4907]: I1129 16:24:41.409305 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5vj2" event={"ID":"55ed86eb-e103-4471-a543-117a0031aef3","Type":"ContainerDied","Data":"5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84"}
Nov 29 16:24:41 crc kubenswrapper[4907]: I1129 16:24:41.409663 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5vj2" event={"ID":"55ed86eb-e103-4471-a543-117a0031aef3","Type":"ContainerStarted","Data":"67712b079869f3665cf2b47e451ce3902a3d7115933ac68d481382d36de22058"}
Nov 29 16:24:42 crc kubenswrapper[4907]: I1129 16:24:42.422328 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5vj2" event={"ID":"55ed86eb-e103-4471-a543-117a0031aef3","Type":"ContainerStarted","Data":"c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f"}
Nov 29 16:24:45 crc kubenswrapper[4907]: I1129 16:24:45.461037 4907 generic.go:334] "Generic (PLEG): container finished" podID="55ed86eb-e103-4471-a543-117a0031aef3" containerID="c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f" exitCode=0
Nov 29 16:24:45 crc kubenswrapper[4907]: I1129 16:24:45.461115 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5vj2" event={"ID":"55ed86eb-e103-4471-a543-117a0031aef3","Type":"ContainerDied","Data":"c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f"}
Nov 29 16:24:46 crc kubenswrapper[4907]: I1129 16:24:46.475942 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5vj2" event={"ID":"55ed86eb-e103-4471-a543-117a0031aef3","Type":"ContainerStarted","Data":"aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512"}
Nov 29 16:24:50 crc kubenswrapper[4907]: I1129 16:24:50.398771 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:50 crc kubenswrapper[4907]: I1129 16:24:50.399268 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:24:51 crc kubenswrapper[4907]: I1129 16:24:51.443684 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5vj2" podUID="55ed86eb-e103-4471-a543-117a0031aef3" containerName="registry-server" probeResult="failure" output=<
Nov 29 16:24:51 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s
Nov 29 16:24:51 crc kubenswrapper[4907]: >
Nov 29 16:25:00 crc kubenswrapper[4907]: I1129 16:25:00.460287 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:25:00 crc kubenswrapper[4907]: I1129 16:25:00.490916 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5vj2" podStartSLOduration=16.025351542 podStartE2EDuration="20.490890446s" podCreationTimestamp="2025-11-29 16:24:40 +0000 UTC" firstStartedPulling="2025-11-29 16:24:41.41217407 +0000 UTC m=+6979.399011722" lastFinishedPulling="2025-11-29 16:24:45.877712944 +0000 UTC m=+6983.864550626" observedRunningTime="2025-11-29 16:24:46.503105481 +0000 UTC m=+6984.489943143" watchObservedRunningTime="2025-11-29 16:25:00.490890446 +0000 UTC m=+6998.477728118"
Nov 29 16:25:00 crc kubenswrapper[4907]: I1129 16:25:00.549149 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:25:00 crc kubenswrapper[4907]: I1129 16:25:00.716106 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5vj2"]
Nov 29 16:25:01 crc kubenswrapper[4907]: I1129 16:25:01.660162 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5vj2" podUID="55ed86eb-e103-4471-a543-117a0031aef3" containerName="registry-server" containerID="cri-o://aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512" gracePeriod=2
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.258310 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.335511 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-662r6\" (UniqueName: \"kubernetes.io/projected/55ed86eb-e103-4471-a543-117a0031aef3-kube-api-access-662r6\") pod \"55ed86eb-e103-4471-a543-117a0031aef3\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") "
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.335638 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-utilities\") pod \"55ed86eb-e103-4471-a543-117a0031aef3\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") "
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.335809 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-catalog-content\") pod \"55ed86eb-e103-4471-a543-117a0031aef3\" (UID: \"55ed86eb-e103-4471-a543-117a0031aef3\") "
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.336836 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-utilities" (OuterVolumeSpecName: "utilities") pod "55ed86eb-e103-4471-a543-117a0031aef3" (UID: "55ed86eb-e103-4471-a543-117a0031aef3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.346553 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ed86eb-e103-4471-a543-117a0031aef3-kube-api-access-662r6" (OuterVolumeSpecName: "kube-api-access-662r6") pod "55ed86eb-e103-4471-a543-117a0031aef3" (UID: "55ed86eb-e103-4471-a543-117a0031aef3"). InnerVolumeSpecName "kube-api-access-662r6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.440846 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-662r6\" (UniqueName: \"kubernetes.io/projected/55ed86eb-e103-4471-a543-117a0031aef3-kube-api-access-662r6\") on node \"crc\" DevicePath \"\""
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.441140 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.469345 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55ed86eb-e103-4471-a543-117a0031aef3" (UID: "55ed86eb-e103-4471-a543-117a0031aef3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.543130 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ed86eb-e103-4471-a543-117a0031aef3-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.673269 4907 generic.go:334] "Generic (PLEG): container finished" podID="55ed86eb-e103-4471-a543-117a0031aef3" containerID="aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512" exitCode=0
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.673328 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5vj2"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.673339 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5vj2" event={"ID":"55ed86eb-e103-4471-a543-117a0031aef3","Type":"ContainerDied","Data":"aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512"}
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.673377 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5vj2" event={"ID":"55ed86eb-e103-4471-a543-117a0031aef3","Type":"ContainerDied","Data":"67712b079869f3665cf2b47e451ce3902a3d7115933ac68d481382d36de22058"}
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.673418 4907 scope.go:117] "RemoveContainer" containerID="aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.702286 4907 scope.go:117] "RemoveContainer" containerID="c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.714475 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5vj2"]
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.733579 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5vj2"]
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.735822 4907 scope.go:117] "RemoveContainer" containerID="5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.771603 4907 scope.go:117] "RemoveContainer" containerID="aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512"
Nov 29 16:25:02 crc kubenswrapper[4907]: E1129 16:25:02.772082 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512\": container with ID starting with aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512 not found: ID does not exist" containerID="aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.772124 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512"} err="failed to get container status \"aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512\": rpc error: code = NotFound desc = could not find container \"aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512\": container with ID starting with aa43c9fad8cf579fbbebf7ff4aadc112ab709811eec9338894c39346ce45f512 not found: ID does not exist"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.772151 4907 scope.go:117] "RemoveContainer" containerID="c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f"
Nov 29 16:25:02 crc kubenswrapper[4907]: E1129 16:25:02.772620 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f\": container with ID starting with c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f not found: ID does not exist" containerID="c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.772661 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f"} err="failed to get container status \"c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f\": rpc error: code = NotFound desc = could not find container \"c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f\": container with ID starting with c1ba750789e62e9792c13905e6b63e220747e3e8e9604f8673f49d07f438134f not found: ID does not exist"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.772686 4907 scope.go:117] "RemoveContainer" containerID="5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84"
Nov 29 16:25:02 crc kubenswrapper[4907]: E1129 16:25:02.772932 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84\": container with ID starting with 5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84 not found: ID does not exist" containerID="5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84"
Nov 29 16:25:02 crc kubenswrapper[4907]: I1129 16:25:02.772959 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84"} err="failed to get container status \"5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84\": rpc error: code = NotFound desc = could not find container \"5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84\": container with ID starting with 5f155ee7ddc6b99f7eda885f4203728d14fec91d404ff3d008d25204de615a84 not found: ID does not exist"
Nov 29 16:25:04 crc kubenswrapper[4907]: I1129 16:25:04.503258 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ed86eb-e103-4471-a543-117a0031aef3" path="/var/lib/kubelet/pods/55ed86eb-e103-4471-a543-117a0031aef3/volumes"